Dec 06 09:00:29 localhost kernel: Linux version 5.14.0-645.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-68.el9) #1 SMP PREEMPT_DYNAMIC Fri Nov 28 14:01:17 UTC 2025
Dec 06 09:00:29 localhost kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Dec 06 09:00:29 localhost kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-645.el9.x86_64 root=UUID=fcf6b761-831a-48a7-9f5f-068b5063763f ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Dec 06 09:00:29 localhost kernel: BIOS-provided physical RAM map:
Dec 06 09:00:29 localhost kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Dec 06 09:00:29 localhost kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Dec 06 09:00:29 localhost kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Dec 06 09:00:29 localhost kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Dec 06 09:00:29 localhost kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Dec 06 09:00:29 localhost kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Dec 06 09:00:29 localhost kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Dec 06 09:00:29 localhost kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Dec 06 09:00:29 localhost kernel: NX (Execute Disable) protection: active
Dec 06 09:00:29 localhost kernel: APIC: Static calls initialized
Dec 06 09:00:29 localhost kernel: SMBIOS 2.8 present.
Dec 06 09:00:29 localhost kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Dec 06 09:00:29 localhost kernel: Hypervisor detected: KVM
Dec 06 09:00:29 localhost kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Dec 06 09:00:29 localhost kernel: kvm-clock: using sched offset of 3146583811 cycles
Dec 06 09:00:29 localhost kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Dec 06 09:00:29 localhost kernel: tsc: Detected 2800.000 MHz processor
Dec 06 09:00:29 localhost kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Dec 06 09:00:29 localhost kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Dec 06 09:00:29 localhost kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Dec 06 09:00:29 localhost kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Dec 06 09:00:29 localhost kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Dec 06 09:00:29 localhost kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Dec 06 09:00:29 localhost kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Dec 06 09:00:29 localhost kernel: Using GB pages for direct mapping
Dec 06 09:00:29 localhost kernel: RAMDISK: [mem 0x2d472000-0x32a30fff]
Dec 06 09:00:29 localhost kernel: ACPI: Early table checksum verification disabled
Dec 06 09:00:29 localhost kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Dec 06 09:00:29 localhost kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec 06 09:00:29 localhost kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec 06 09:00:29 localhost kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec 06 09:00:29 localhost kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Dec 06 09:00:29 localhost kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec 06 09:00:29 localhost kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec 06 09:00:29 localhost kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Dec 06 09:00:29 localhost kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Dec 06 09:00:29 localhost kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Dec 06 09:00:29 localhost kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Dec 06 09:00:29 localhost kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Dec 06 09:00:29 localhost kernel: No NUMA configuration found
Dec 06 09:00:29 localhost kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Dec 06 09:00:29 localhost kernel: NODE_DATA(0) allocated [mem 0x23ffd3000-0x23fffdfff]
Dec 06 09:00:29 localhost kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Dec 06 09:00:29 localhost kernel: Zone ranges:
Dec 06 09:00:29 localhost kernel:   DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Dec 06 09:00:29 localhost kernel:   DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Dec 06 09:00:29 localhost kernel:   Normal   [mem 0x0000000100000000-0x000000023fffffff]
Dec 06 09:00:29 localhost kernel:   Device   empty
Dec 06 09:00:29 localhost kernel: Movable zone start for each node
Dec 06 09:00:29 localhost kernel: Early memory node ranges
Dec 06 09:00:29 localhost kernel:   node   0: [mem 0x0000000000001000-0x000000000009efff]
Dec 06 09:00:29 localhost kernel:   node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Dec 06 09:00:29 localhost kernel:   node   0: [mem 0x0000000100000000-0x000000023fffffff]
Dec 06 09:00:29 localhost kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Dec 06 09:00:29 localhost kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Dec 06 09:00:29 localhost kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Dec 06 09:00:29 localhost kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Dec 06 09:00:29 localhost kernel: ACPI: PM-Timer IO Port: 0x608
Dec 06 09:00:29 localhost kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Dec 06 09:00:29 localhost kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Dec 06 09:00:29 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Dec 06 09:00:29 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Dec 06 09:00:29 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Dec 06 09:00:29 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Dec 06 09:00:29 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Dec 06 09:00:29 localhost kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Dec 06 09:00:29 localhost kernel: TSC deadline timer available
Dec 06 09:00:29 localhost kernel: CPU topo: Max. logical packages:   8
Dec 06 09:00:29 localhost kernel: CPU topo: Max. logical dies:       8
Dec 06 09:00:29 localhost kernel: CPU topo: Max. dies per package:   1
Dec 06 09:00:29 localhost kernel: CPU topo: Max. threads per core:   1
Dec 06 09:00:29 localhost kernel: CPU topo: Num. cores per package:     1
Dec 06 09:00:29 localhost kernel: CPU topo: Num. threads per package:   1
Dec 06 09:00:29 localhost kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Dec 06 09:00:29 localhost kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Dec 06 09:00:29 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Dec 06 09:00:29 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Dec 06 09:00:29 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Dec 06 09:00:29 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Dec 06 09:00:29 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Dec 06 09:00:29 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Dec 06 09:00:29 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Dec 06 09:00:29 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Dec 06 09:00:29 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Dec 06 09:00:29 localhost kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Dec 06 09:00:29 localhost kernel: Booting paravirtualized kernel on KVM
Dec 06 09:00:29 localhost kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Dec 06 09:00:29 localhost kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Dec 06 09:00:29 localhost kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Dec 06 09:00:29 localhost kernel: pcpu-alloc: s225280 r8192 d28672 u262144 alloc=1*2097152
Dec 06 09:00:29 localhost kernel: pcpu-alloc: [0] 0 1 2 3 4 5 6 7 
Dec 06 09:00:29 localhost kernel: kvm-guest: PV spinlocks disabled, no host support
Dec 06 09:00:29 localhost kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-645.el9.x86_64 root=UUID=fcf6b761-831a-48a7-9f5f-068b5063763f ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Dec 06 09:00:29 localhost kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-645.el9.x86_64", will be passed to user space.
Dec 06 09:00:29 localhost kernel: random: crng init done
Dec 06 09:00:29 localhost kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Dec 06 09:00:29 localhost kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Dec 06 09:00:29 localhost kernel: Fallback order for Node 0: 0 
Dec 06 09:00:29 localhost kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Dec 06 09:00:29 localhost kernel: Policy zone: Normal
Dec 06 09:00:29 localhost kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Dec 06 09:00:29 localhost kernel: software IO TLB: area num 8.
Dec 06 09:00:29 localhost kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Dec 06 09:00:29 localhost kernel: ftrace: allocating 49335 entries in 193 pages
Dec 06 09:00:29 localhost kernel: ftrace: allocated 193 pages with 3 groups
Dec 06 09:00:29 localhost kernel: Dynamic Preempt: voluntary
Dec 06 09:00:29 localhost kernel: rcu: Preemptible hierarchical RCU implementation.
Dec 06 09:00:29 localhost kernel: rcu:         RCU event tracing is enabled.
Dec 06 09:00:29 localhost kernel: rcu:         RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Dec 06 09:00:29 localhost kernel:         Trampoline variant of Tasks RCU enabled.
Dec 06 09:00:29 localhost kernel:         Rude variant of Tasks RCU enabled.
Dec 06 09:00:29 localhost kernel:         Tracing variant of Tasks RCU enabled.
Dec 06 09:00:29 localhost kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Dec 06 09:00:29 localhost kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Dec 06 09:00:29 localhost kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Dec 06 09:00:29 localhost kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Dec 06 09:00:29 localhost kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Dec 06 09:00:29 localhost kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Dec 06 09:00:29 localhost kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Dec 06 09:00:29 localhost kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Dec 06 09:00:29 localhost kernel: Console: colour VGA+ 80x25
Dec 06 09:00:29 localhost kernel: printk: console [ttyS0] enabled
Dec 06 09:00:29 localhost kernel: ACPI: Core revision 20230331
Dec 06 09:00:29 localhost kernel: APIC: Switch to symmetric I/O mode setup
Dec 06 09:00:29 localhost kernel: x2apic enabled
Dec 06 09:00:29 localhost kernel: APIC: Switched APIC routing to: physical x2apic
Dec 06 09:00:29 localhost kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Dec 06 09:00:29 localhost kernel: Calibrating delay loop (skipped) preset value.. 5600.00 BogoMIPS (lpj=2800000)
Dec 06 09:00:29 localhost kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Dec 06 09:00:29 localhost kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Dec 06 09:00:29 localhost kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Dec 06 09:00:29 localhost kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Dec 06 09:00:29 localhost kernel: Spectre V2 : Mitigation: Retpolines
Dec 06 09:00:29 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Dec 06 09:00:29 localhost kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Dec 06 09:00:29 localhost kernel: RETBleed: Mitigation: untrained return thunk
Dec 06 09:00:29 localhost kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Dec 06 09:00:29 localhost kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Dec 06 09:00:29 localhost kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Dec 06 09:00:29 localhost kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Dec 06 09:00:29 localhost kernel: x86/bugs: return thunk changed
Dec 06 09:00:29 localhost kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Dec 06 09:00:29 localhost kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Dec 06 09:00:29 localhost kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Dec 06 09:00:29 localhost kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Dec 06 09:00:29 localhost kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Dec 06 09:00:29 localhost kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Dec 06 09:00:29 localhost kernel: Freeing SMP alternatives memory: 40K
Dec 06 09:00:29 localhost kernel: pid_max: default: 32768 minimum: 301
Dec 06 09:00:29 localhost kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Dec 06 09:00:29 localhost kernel: landlock: Up and running.
Dec 06 09:00:29 localhost kernel: Yama: becoming mindful.
Dec 06 09:00:29 localhost kernel: SELinux:  Initializing.
Dec 06 09:00:29 localhost kernel: LSM support for eBPF active
Dec 06 09:00:29 localhost kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Dec 06 09:00:29 localhost kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Dec 06 09:00:29 localhost kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Dec 06 09:00:29 localhost kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Dec 06 09:00:29 localhost kernel: ... version:                0
Dec 06 09:00:29 localhost kernel: ... bit width:              48
Dec 06 09:00:29 localhost kernel: ... generic registers:      6
Dec 06 09:00:29 localhost kernel: ... value mask:             0000ffffffffffff
Dec 06 09:00:29 localhost kernel: ... max period:             00007fffffffffff
Dec 06 09:00:29 localhost kernel: ... fixed-purpose events:   0
Dec 06 09:00:29 localhost kernel: ... event mask:             000000000000003f
Dec 06 09:00:29 localhost kernel: signal: max sigframe size: 1776
Dec 06 09:00:29 localhost kernel: rcu: Hierarchical SRCU implementation.
Dec 06 09:00:29 localhost kernel: rcu:         Max phase no-delay instances is 400.
Dec 06 09:00:29 localhost kernel: smp: Bringing up secondary CPUs ...
Dec 06 09:00:29 localhost kernel: smpboot: x86: Booting SMP configuration:
Dec 06 09:00:29 localhost kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Dec 06 09:00:29 localhost kernel: smp: Brought up 1 node, 8 CPUs
Dec 06 09:00:29 localhost kernel: smpboot: Total of 8 processors activated (44800.00 BogoMIPS)
Dec 06 09:00:29 localhost kernel: node 0 deferred pages initialised in 10ms
Dec 06 09:00:29 localhost kernel: Memory: 7763996K/8388068K available (16384K kernel code, 5795K rwdata, 13908K rodata, 4196K init, 7156K bss, 618212K reserved, 0K cma-reserved)
Dec 06 09:00:29 localhost kernel: devtmpfs: initialized
Dec 06 09:00:29 localhost kernel: x86/mm: Memory block size: 128MB
Dec 06 09:00:29 localhost kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Dec 06 09:00:29 localhost kernel: futex hash table entries: 2048 (131072 bytes on 1 NUMA nodes, total 128 KiB, linear).
Dec 06 09:00:29 localhost kernel: pinctrl core: initialized pinctrl subsystem
Dec 06 09:00:29 localhost kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Dec 06 09:00:29 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Dec 06 09:00:29 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Dec 06 09:00:29 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Dec 06 09:00:29 localhost kernel: audit: initializing netlink subsys (disabled)
Dec 06 09:00:29 localhost kernel: audit: type=2000 audit(1765011627.390:1): state=initialized audit_enabled=0 res=1
Dec 06 09:00:29 localhost kernel: thermal_sys: Registered thermal governor 'fair_share'
Dec 06 09:00:29 localhost kernel: thermal_sys: Registered thermal governor 'step_wise'
Dec 06 09:00:29 localhost kernel: thermal_sys: Registered thermal governor 'user_space'
Dec 06 09:00:29 localhost kernel: cpuidle: using governor menu
Dec 06 09:00:29 localhost kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Dec 06 09:00:29 localhost kernel: PCI: Using configuration type 1 for base access
Dec 06 09:00:29 localhost kernel: PCI: Using configuration type 1 for extended access
Dec 06 09:00:29 localhost kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Dec 06 09:00:29 localhost kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Dec 06 09:00:29 localhost kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Dec 06 09:00:29 localhost kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Dec 06 09:00:29 localhost kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Dec 06 09:00:29 localhost kernel: Demotion targets for Node 0: null
Dec 06 09:00:29 localhost kernel: cryptd: max_cpu_qlen set to 1000
Dec 06 09:00:29 localhost kernel: ACPI: Added _OSI(Module Device)
Dec 06 09:00:29 localhost kernel: ACPI: Added _OSI(Processor Device)
Dec 06 09:00:29 localhost kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Dec 06 09:00:29 localhost kernel: ACPI: Added _OSI(Processor Aggregator Device)
Dec 06 09:00:29 localhost kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Dec 06 09:00:29 localhost kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Dec 06 09:00:29 localhost kernel: ACPI: Interpreter enabled
Dec 06 09:00:29 localhost kernel: ACPI: PM: (supports S0 S3 S4 S5)
Dec 06 09:00:29 localhost kernel: ACPI: Using IOAPIC for interrupt routing
Dec 06 09:00:29 localhost kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Dec 06 09:00:29 localhost kernel: PCI: Using E820 reservations for host bridge windows
Dec 06 09:00:29 localhost kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Dec 06 09:00:29 localhost kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Dec 06 09:00:29 localhost kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Dec 06 09:00:29 localhost kernel: acpiphp: Slot [3] registered
Dec 06 09:00:29 localhost kernel: acpiphp: Slot [4] registered
Dec 06 09:00:29 localhost kernel: acpiphp: Slot [5] registered
Dec 06 09:00:29 localhost kernel: acpiphp: Slot [6] registered
Dec 06 09:00:29 localhost kernel: acpiphp: Slot [7] registered
Dec 06 09:00:29 localhost kernel: acpiphp: Slot [8] registered
Dec 06 09:00:29 localhost kernel: acpiphp: Slot [9] registered
Dec 06 09:00:29 localhost kernel: acpiphp: Slot [10] registered
Dec 06 09:00:29 localhost kernel: acpiphp: Slot [11] registered
Dec 06 09:00:29 localhost kernel: acpiphp: Slot [12] registered
Dec 06 09:00:29 localhost kernel: acpiphp: Slot [13] registered
Dec 06 09:00:29 localhost kernel: acpiphp: Slot [14] registered
Dec 06 09:00:29 localhost kernel: acpiphp: Slot [15] registered
Dec 06 09:00:29 localhost kernel: acpiphp: Slot [16] registered
Dec 06 09:00:29 localhost kernel: acpiphp: Slot [17] registered
Dec 06 09:00:29 localhost kernel: acpiphp: Slot [18] registered
Dec 06 09:00:29 localhost kernel: acpiphp: Slot [19] registered
Dec 06 09:00:29 localhost kernel: acpiphp: Slot [20] registered
Dec 06 09:00:29 localhost kernel: acpiphp: Slot [21] registered
Dec 06 09:00:29 localhost kernel: acpiphp: Slot [22] registered
Dec 06 09:00:29 localhost kernel: acpiphp: Slot [23] registered
Dec 06 09:00:29 localhost kernel: acpiphp: Slot [24] registered
Dec 06 09:00:29 localhost kernel: acpiphp: Slot [25] registered
Dec 06 09:00:29 localhost kernel: acpiphp: Slot [26] registered
Dec 06 09:00:29 localhost kernel: acpiphp: Slot [27] registered
Dec 06 09:00:29 localhost kernel: acpiphp: Slot [28] registered
Dec 06 09:00:29 localhost kernel: acpiphp: Slot [29] registered
Dec 06 09:00:29 localhost kernel: acpiphp: Slot [30] registered
Dec 06 09:00:29 localhost kernel: acpiphp: Slot [31] registered
Dec 06 09:00:29 localhost kernel: PCI host bridge to bus 0000:00
Dec 06 09:00:29 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Dec 06 09:00:29 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Dec 06 09:00:29 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Dec 06 09:00:29 localhost kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Dec 06 09:00:29 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Dec 06 09:00:29 localhost kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Dec 06 09:00:29 localhost kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Dec 06 09:00:29 localhost kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Dec 06 09:00:29 localhost kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Dec 06 09:00:29 localhost kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Dec 06 09:00:29 localhost kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Dec 06 09:00:29 localhost kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Dec 06 09:00:29 localhost kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Dec 06 09:00:29 localhost kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Dec 06 09:00:29 localhost kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Dec 06 09:00:29 localhost kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Dec 06 09:00:29 localhost kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Dec 06 09:00:29 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Dec 06 09:00:29 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Dec 06 09:00:29 localhost kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Dec 06 09:00:29 localhost kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Dec 06 09:00:29 localhost kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Dec 06 09:00:29 localhost kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Dec 06 09:00:29 localhost kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Dec 06 09:00:29 localhost kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Dec 06 09:00:29 localhost kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Dec 06 09:00:29 localhost kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Dec 06 09:00:29 localhost kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Dec 06 09:00:29 localhost kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Dec 06 09:00:29 localhost kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Dec 06 09:00:29 localhost kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Dec 06 09:00:29 localhost kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Dec 06 09:00:29 localhost kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Dec 06 09:00:29 localhost kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Dec 06 09:00:29 localhost kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Dec 06 09:00:29 localhost kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Dec 06 09:00:29 localhost kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Dec 06 09:00:29 localhost kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Dec 06 09:00:29 localhost kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Dec 06 09:00:29 localhost kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Dec 06 09:00:29 localhost kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Dec 06 09:00:29 localhost kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Dec 06 09:00:29 localhost kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Dec 06 09:00:29 localhost kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Dec 06 09:00:29 localhost kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Dec 06 09:00:29 localhost kernel: iommu: Default domain type: Translated
Dec 06 09:00:29 localhost kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Dec 06 09:00:29 localhost kernel: SCSI subsystem initialized
Dec 06 09:00:29 localhost kernel: ACPI: bus type USB registered
Dec 06 09:00:29 localhost kernel: usbcore: registered new interface driver usbfs
Dec 06 09:00:29 localhost kernel: usbcore: registered new interface driver hub
Dec 06 09:00:29 localhost kernel: usbcore: registered new device driver usb
Dec 06 09:00:29 localhost kernel: pps_core: LinuxPPS API ver. 1 registered
Dec 06 09:00:29 localhost kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Dec 06 09:00:29 localhost kernel: PTP clock support registered
Dec 06 09:00:29 localhost kernel: EDAC MC: Ver: 3.0.0
Dec 06 09:00:29 localhost kernel: NetLabel: Initializing
Dec 06 09:00:29 localhost kernel: NetLabel:  domain hash size = 128
Dec 06 09:00:29 localhost kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Dec 06 09:00:29 localhost kernel: NetLabel:  unlabeled traffic allowed by default
Dec 06 09:00:29 localhost kernel: PCI: Using ACPI for IRQ routing
Dec 06 09:00:29 localhost kernel: PCI: pci_cache_line_size set to 64 bytes
Dec 06 09:00:29 localhost kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Dec 06 09:00:29 localhost kernel: e820: reserve RAM buffer [mem 0xbffdb000-0xbfffffff]
Dec 06 09:00:29 localhost kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Dec 06 09:00:29 localhost kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Dec 06 09:00:29 localhost kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Dec 06 09:00:29 localhost kernel: vgaarb: loaded
Dec 06 09:00:29 localhost kernel: clocksource: Switched to clocksource kvm-clock
Dec 06 09:00:29 localhost kernel: VFS: Disk quotas dquot_6.6.0
Dec 06 09:00:29 localhost kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Dec 06 09:00:29 localhost kernel: pnp: PnP ACPI init
Dec 06 09:00:29 localhost kernel: pnp 00:03: [dma 2]
Dec 06 09:00:29 localhost kernel: pnp: PnP ACPI: found 5 devices
Dec 06 09:00:29 localhost kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Dec 06 09:00:29 localhost kernel: NET: Registered PF_INET protocol family
Dec 06 09:00:29 localhost kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Dec 06 09:00:29 localhost kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Dec 06 09:00:29 localhost kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Dec 06 09:00:29 localhost kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Dec 06 09:00:29 localhost kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Dec 06 09:00:29 localhost kernel: TCP: Hash tables configured (established 65536 bind 65536)
Dec 06 09:00:29 localhost kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Dec 06 09:00:29 localhost kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Dec 06 09:00:29 localhost kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Dec 06 09:00:29 localhost kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Dec 06 09:00:29 localhost kernel: NET: Registered PF_XDP protocol family
Dec 06 09:00:29 localhost kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Dec 06 09:00:29 localhost kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Dec 06 09:00:29 localhost kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Dec 06 09:00:29 localhost kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Dec 06 09:00:29 localhost kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Dec 06 09:00:29 localhost kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Dec 06 09:00:29 localhost kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Dec 06 09:00:29 localhost kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Dec 06 09:00:29 localhost kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x160 took 81127 usecs
Dec 06 09:00:29 localhost kernel: PCI: CLS 0 bytes, default 64
Dec 06 09:00:29 localhost kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Dec 06 09:00:29 localhost kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Dec 06 09:00:29 localhost kernel: ACPI: bus type thunderbolt registered
Dec 06 09:00:29 localhost kernel: Trying to unpack rootfs image as initramfs...
Dec 06 09:00:29 localhost kernel: Initialise system trusted keyrings
Dec 06 09:00:29 localhost kernel: Key type blacklist registered
Dec 06 09:00:29 localhost kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Dec 06 09:00:29 localhost kernel: zbud: loaded
Dec 06 09:00:29 localhost kernel: integrity: Platform Keyring initialized
Dec 06 09:00:29 localhost kernel: integrity: Machine keyring initialized
Dec 06 09:00:29 localhost kernel: Freeing initrd memory: 87804K
Dec 06 09:00:29 localhost kernel: NET: Registered PF_ALG protocol family
Dec 06 09:00:29 localhost kernel: xor: automatically using best checksumming function   avx       
Dec 06 09:00:29 localhost kernel: Key type asymmetric registered
Dec 06 09:00:29 localhost kernel: Asymmetric key parser 'x509' registered
Dec 06 09:00:29 localhost kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Dec 06 09:00:29 localhost kernel: io scheduler mq-deadline registered
Dec 06 09:00:29 localhost kernel: io scheduler kyber registered
Dec 06 09:00:29 localhost kernel: io scheduler bfq registered
Dec 06 09:00:29 localhost kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Dec 06 09:00:29 localhost kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Dec 06 09:00:29 localhost kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Dec 06 09:00:29 localhost kernel: ACPI: button: Power Button [PWRF]
Dec 06 09:00:29 localhost kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Dec 06 09:00:29 localhost kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Dec 06 09:00:29 localhost kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Dec 06 09:00:29 localhost kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Dec 06 09:00:29 localhost kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Dec 06 09:00:29 localhost kernel: Non-volatile memory driver v1.3
Dec 06 09:00:29 localhost kernel: rdac: device handler registered
Dec 06 09:00:29 localhost kernel: hp_sw: device handler registered
Dec 06 09:00:29 localhost kernel: emc: device handler registered
Dec 06 09:00:29 localhost kernel: alua: device handler registered
Dec 06 09:00:29 localhost kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Dec 06 09:00:29 localhost kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Dec 06 09:00:29 localhost kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Dec 06 09:00:29 localhost kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Dec 06 09:00:29 localhost kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Dec 06 09:00:29 localhost kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Dec 06 09:00:29 localhost kernel: usb usb1: Product: UHCI Host Controller
Dec 06 09:00:29 localhost kernel: usb usb1: Manufacturer: Linux 5.14.0-645.el9.x86_64 uhci_hcd
Dec 06 09:00:29 localhost kernel: usb usb1: SerialNumber: 0000:00:01.2
Dec 06 09:00:29 localhost kernel: hub 1-0:1.0: USB hub found
Dec 06 09:00:29 localhost kernel: hub 1-0:1.0: 2 ports detected
Dec 06 09:00:29 localhost kernel: usbcore: registered new interface driver usbserial_generic
Dec 06 09:00:29 localhost kernel: usbserial: USB Serial support registered for generic
Dec 06 09:00:29 localhost kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Dec 06 09:00:29 localhost kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Dec 06 09:00:29 localhost kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Dec 06 09:00:29 localhost kernel: mousedev: PS/2 mouse device common for all mice
Dec 06 09:00:29 localhost kernel: rtc_cmos 00:04: RTC can wake from S4
Dec 06 09:00:29 localhost kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Dec 06 09:00:29 localhost kernel: rtc_cmos 00:04: registered as rtc0
Dec 06 09:00:29 localhost kernel: rtc_cmos 00:04: setting system clock to 2025-12-06T09:00:28 UTC (1765011628)
Dec 06 09:00:29 localhost kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Dec 06 09:00:29 localhost kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Dec 06 09:00:29 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Dec 06 09:00:29 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Dec 06 09:00:29 localhost kernel: hid: raw HID events driver (C) Jiri Kosina
Dec 06 09:00:29 localhost kernel: usbcore: registered new interface driver usbhid
Dec 06 09:00:29 localhost kernel: usbhid: USB HID core driver
Dec 06 09:00:29 localhost kernel: drop_monitor: Initializing network drop monitor service
Dec 06 09:00:29 localhost kernel: Initializing XFRM netlink socket
Dec 06 09:00:29 localhost kernel: NET: Registered PF_INET6 protocol family
Dec 06 09:00:29 localhost kernel: Segment Routing with IPv6
Dec 06 09:00:29 localhost kernel: NET: Registered PF_PACKET protocol family
Dec 06 09:00:29 localhost kernel: mpls_gso: MPLS GSO support
Dec 06 09:00:29 localhost kernel: IPI shorthand broadcast: enabled
Dec 06 09:00:29 localhost kernel: AVX2 version of gcm_enc/dec engaged.
Dec 06 09:00:29 localhost kernel: AES CTR mode by8 optimization enabled
Dec 06 09:00:29 localhost kernel: sched_clock: Marking stable (1213005399, 145039410)->(1440318279, -82273470)
Dec 06 09:00:29 localhost kernel: registered taskstats version 1
Dec 06 09:00:29 localhost kernel: Loading compiled-in X.509 certificates
Dec 06 09:00:29 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 4c28336b4850d771d036b52fb2778fdb4f02f708'
Dec 06 09:00:29 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Dec 06 09:00:29 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Dec 06 09:00:29 localhost kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Dec 06 09:00:29 localhost kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Dec 06 09:00:29 localhost kernel: Demotion targets for Node 0: null
Dec 06 09:00:29 localhost kernel: page_owner is disabled
Dec 06 09:00:29 localhost kernel: Key type .fscrypt registered
Dec 06 09:00:29 localhost kernel: Key type fscrypt-provisioning registered
Dec 06 09:00:29 localhost kernel: Key type big_key registered
Dec 06 09:00:29 localhost kernel: Key type encrypted registered
Dec 06 09:00:29 localhost kernel: ima: No TPM chip found, activating TPM-bypass!
Dec 06 09:00:29 localhost kernel: Loading compiled-in module X.509 certificates
Dec 06 09:00:29 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 4c28336b4850d771d036b52fb2778fdb4f02f708'
Dec 06 09:00:29 localhost kernel: ima: Allocated hash algorithm: sha256
Dec 06 09:00:29 localhost kernel: ima: No architecture policies found
Dec 06 09:00:29 localhost kernel: evm: Initialising EVM extended attributes:
Dec 06 09:00:29 localhost kernel: evm: security.selinux
Dec 06 09:00:29 localhost kernel: evm: security.SMACK64 (disabled)
Dec 06 09:00:29 localhost kernel: evm: security.SMACK64EXEC (disabled)
Dec 06 09:00:29 localhost kernel: evm: security.SMACK64TRANSMUTE (disabled)
Dec 06 09:00:29 localhost kernel: evm: security.SMACK64MMAP (disabled)
Dec 06 09:00:29 localhost kernel: evm: security.apparmor (disabled)
Dec 06 09:00:29 localhost kernel: evm: security.ima
Dec 06 09:00:29 localhost kernel: evm: security.capability
Dec 06 09:00:29 localhost kernel: evm: HMAC attrs: 0x1
Dec 06 09:00:29 localhost kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Dec 06 09:00:29 localhost kernel: Running certificate verification RSA selftest
Dec 06 09:00:29 localhost kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Dec 06 09:00:29 localhost kernel: Running certificate verification ECDSA selftest
Dec 06 09:00:29 localhost kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Dec 06 09:00:29 localhost kernel: clk: Disabling unused clocks
Dec 06 09:00:29 localhost kernel: Freeing unused decrypted memory: 2028K
Dec 06 09:00:29 localhost kernel: Freeing unused kernel image (initmem) memory: 4196K
Dec 06 09:00:29 localhost kernel: Write protecting the kernel read-only data: 30720k
Dec 06 09:00:29 localhost kernel: Freeing unused kernel image (rodata/data gap) memory: 428K
Dec 06 09:00:29 localhost kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Dec 06 09:00:29 localhost kernel: Run /init as init process
Dec 06 09:00:29 localhost kernel:   with arguments:
Dec 06 09:00:29 localhost kernel:     /init
Dec 06 09:00:29 localhost kernel:   with environment:
Dec 06 09:00:29 localhost kernel:     HOME=/
Dec 06 09:00:29 localhost kernel:     TERM=linux
Dec 06 09:00:29 localhost kernel:     BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-645.el9.x86_64
Dec 06 09:00:29 localhost systemd[1]: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Dec 06 09:00:29 localhost systemd[1]: Detected virtualization kvm.
Dec 06 09:00:29 localhost systemd[1]: Detected architecture x86-64.
Dec 06 09:00:29 localhost systemd[1]: Running in initrd.
Dec 06 09:00:29 localhost kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Dec 06 09:00:29 localhost kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Dec 06 09:00:29 localhost kernel: usb 1-1: Product: QEMU USB Tablet
Dec 06 09:00:29 localhost kernel: usb 1-1: Manufacturer: QEMU
Dec 06 09:00:29 localhost kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Dec 06 09:00:29 localhost kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Dec 06 09:00:29 localhost kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Dec 06 09:00:29 localhost systemd[1]: No hostname configured, using default hostname.
Dec 06 09:00:29 localhost systemd[1]: Hostname set to <localhost>.
Dec 06 09:00:29 localhost systemd[1]: Initializing machine ID from VM UUID.
Dec 06 09:00:29 localhost systemd[1]: Queued start job for default target Initrd Default Target.
Dec 06 09:00:29 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Dec 06 09:00:29 localhost systemd[1]: Reached target Local Encrypted Volumes.
Dec 06 09:00:29 localhost systemd[1]: Reached target Initrd /usr File System.
Dec 06 09:00:29 localhost systemd[1]: Reached target Local File Systems.
Dec 06 09:00:29 localhost systemd[1]: Reached target Path Units.
Dec 06 09:00:29 localhost systemd[1]: Reached target Slice Units.
Dec 06 09:00:29 localhost systemd[1]: Reached target Swaps.
Dec 06 09:00:29 localhost systemd[1]: Reached target Timer Units.
Dec 06 09:00:29 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Dec 06 09:00:29 localhost systemd[1]: Listening on Journal Socket (/dev/log).
Dec 06 09:00:29 localhost systemd[1]: Listening on Journal Socket.
Dec 06 09:00:29 localhost systemd[1]: Listening on udev Control Socket.
Dec 06 09:00:29 localhost systemd[1]: Listening on udev Kernel Socket.
Dec 06 09:00:29 localhost systemd[1]: Reached target Socket Units.
Dec 06 09:00:29 localhost systemd[1]: Starting Create List of Static Device Nodes...
Dec 06 09:00:29 localhost systemd[1]: Starting Journal Service...
Dec 06 09:00:29 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Dec 06 09:00:29 localhost systemd[1]: Starting Apply Kernel Variables...
Dec 06 09:00:29 localhost systemd[1]: Starting Create System Users...
Dec 06 09:00:29 localhost systemd[1]: Starting Setup Virtual Console...
Dec 06 09:00:29 localhost systemd[1]: Finished Create List of Static Device Nodes.
Dec 06 09:00:29 localhost systemd[1]: Finished Apply Kernel Variables.
Dec 06 09:00:29 localhost systemd[1]: Finished Create System Users.
Dec 06 09:00:29 localhost systemd-journald[309]: Journal started
Dec 06 09:00:29 localhost systemd-journald[309]: Runtime Journal (/run/log/journal/9a5f3f62e1ed4c638d00a3c5e56bbddc) is 8.0M, max 153.6M, 145.6M free.
Dec 06 09:00:29 localhost systemd-sysusers[312]: Creating group 'users' with GID 100.
Dec 06 09:00:29 localhost systemd-sysusers[312]: Creating group 'dbus' with GID 81.
Dec 06 09:00:29 localhost systemd-sysusers[312]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Dec 06 09:00:29 localhost systemd[1]: Started Journal Service.
Dec 06 09:00:29 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Dec 06 09:00:29 localhost systemd[1]: Starting Create Volatile Files and Directories...
Dec 06 09:00:29 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Dec 06 09:00:29 localhost systemd[1]: Finished Setup Virtual Console.
Dec 06 09:00:29 localhost systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Dec 06 09:00:29 localhost systemd[1]: Starting dracut cmdline hook...
Dec 06 09:00:29 localhost systemd[1]: Finished Create Volatile Files and Directories.
Dec 06 09:00:29 localhost dracut-cmdline[327]: dracut-9 dracut-057-102.git20250818.el9
Dec 06 09:00:30 localhost dracut-cmdline[327]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-645.el9.x86_64 root=UUID=fcf6b761-831a-48a7-9f5f-068b5063763f ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Dec 06 09:00:30 localhost systemd[1]: Finished dracut cmdline hook.
Dec 06 09:00:30 localhost systemd[1]: Starting dracut pre-udev hook...
Dec 06 09:00:30 localhost kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Dec 06 09:00:30 localhost kernel: device-mapper: uevent: version 1.0.3
Dec 06 09:00:30 localhost kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Dec 06 09:00:30 localhost kernel: RPC: Registered named UNIX socket transport module.
Dec 06 09:00:30 localhost kernel: RPC: Registered udp transport module.
Dec 06 09:00:30 localhost kernel: RPC: Registered tcp transport module.
Dec 06 09:00:30 localhost kernel: RPC: Registered tcp-with-tls transport module.
Dec 06 09:00:30 localhost kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Dec 06 09:00:30 localhost rpc.statd[443]: Version 2.5.4 starting
Dec 06 09:00:30 localhost rpc.statd[443]: Initializing NSM state
Dec 06 09:00:30 localhost rpc.idmapd[448]: Setting log level to 0
Dec 06 09:00:30 localhost systemd[1]: Finished dracut pre-udev hook.
Dec 06 09:00:30 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Dec 06 09:00:30 localhost systemd-udevd[461]: Using default interface naming scheme 'rhel-9.0'.
Dec 06 09:00:30 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Dec 06 09:00:30 localhost systemd[1]: Starting dracut pre-trigger hook...
Dec 06 09:00:30 localhost systemd[1]: Finished dracut pre-trigger hook.
Dec 06 09:00:30 localhost systemd[1]: Starting Coldplug All udev Devices...
Dec 06 09:00:30 localhost systemd[1]: Created slice Slice /system/modprobe.
Dec 06 09:00:30 localhost systemd[1]: Starting Load Kernel Module configfs...
Dec 06 09:00:30 localhost systemd[1]: Finished Coldplug All udev Devices.
Dec 06 09:00:30 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec 06 09:00:30 localhost systemd[1]: Finished Load Kernel Module configfs.
Dec 06 09:00:30 localhost systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Dec 06 09:00:30 localhost systemd[1]: Reached target Network.
Dec 06 09:00:30 localhost systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Dec 06 09:00:30 localhost systemd[1]: Starting dracut initqueue hook...
Dec 06 09:00:30 localhost kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Dec 06 09:00:30 localhost kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Dec 06 09:00:30 localhost kernel:  vda: vda1
Dec 06 09:00:30 localhost systemd-udevd[501]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 09:00:30 localhost kernel: libata version 3.00 loaded.
Dec 06 09:00:30 localhost kernel: ata_piix 0000:00:01.1: version 2.13
Dec 06 09:00:30 localhost kernel: scsi host0: ata_piix
Dec 06 09:00:30 localhost systemd[1]: Mounting Kernel Configuration File System...
Dec 06 09:00:30 localhost kernel: scsi host1: ata_piix
Dec 06 09:00:30 localhost kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Dec 06 09:00:30 localhost kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Dec 06 09:00:30 localhost systemd[1]: Mounted Kernel Configuration File System.
Dec 06 09:00:30 localhost systemd[1]: Found device /dev/disk/by-uuid/fcf6b761-831a-48a7-9f5f-068b5063763f.
Dec 06 09:00:30 localhost systemd[1]: Reached target Initrd Root Device.
Dec 06 09:00:30 localhost systemd[1]: Reached target System Initialization.
Dec 06 09:00:30 localhost systemd[1]: Reached target Basic System.
Dec 06 09:00:30 localhost kernel: ata1: found unknown device (class 0)
Dec 06 09:00:30 localhost kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Dec 06 09:00:30 localhost kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Dec 06 09:00:30 localhost kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Dec 06 09:00:31 localhost kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Dec 06 09:00:31 localhost kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Dec 06 09:00:31 localhost kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0
Dec 06 09:00:31 localhost systemd[1]: Finished dracut initqueue hook.
Dec 06 09:00:31 localhost systemd[1]: Reached target Preparation for Remote File Systems.
Dec 06 09:00:31 localhost systemd[1]: Reached target Remote Encrypted Volumes.
Dec 06 09:00:31 localhost systemd[1]: Reached target Remote File Systems.
Dec 06 09:00:31 localhost systemd[1]: Starting dracut pre-mount hook...
Dec 06 09:00:31 localhost systemd[1]: Finished dracut pre-mount hook.
Dec 06 09:00:31 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/fcf6b761-831a-48a7-9f5f-068b5063763f...
Dec 06 09:00:31 localhost systemd-fsck[556]: /usr/sbin/fsck.xfs: XFS file system.
Dec 06 09:00:31 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/fcf6b761-831a-48a7-9f5f-068b5063763f.
Dec 06 09:00:31 localhost systemd[1]: Mounting /sysroot...
Dec 06 09:00:31 localhost kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Dec 06 09:00:31 localhost kernel: XFS (vda1): Mounting V5 Filesystem fcf6b761-831a-48a7-9f5f-068b5063763f
Dec 06 09:00:31 localhost kernel: XFS (vda1): Ending clean mount
Dec 06 09:00:31 localhost systemd[1]: Mounted /sysroot.
Dec 06 09:00:31 localhost systemd[1]: Reached target Initrd Root File System.
Dec 06 09:00:31 localhost systemd[1]: Starting Mountpoints Configured in the Real Root...
Dec 06 09:00:31 localhost systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Dec 06 09:00:31 localhost systemd[1]: Finished Mountpoints Configured in the Real Root.
Dec 06 09:00:31 localhost systemd[1]: Reached target Initrd File Systems.
Dec 06 09:00:31 localhost systemd[1]: Reached target Initrd Default Target.
Dec 06 09:00:31 localhost systemd[1]: Starting dracut mount hook...
Dec 06 09:00:31 localhost systemd[1]: Finished dracut mount hook.
Dec 06 09:00:31 localhost systemd[1]: Starting dracut pre-pivot and cleanup hook...
Dec 06 09:00:32 localhost rpc.idmapd[448]: exiting on signal 15
Dec 06 09:00:32 localhost systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Dec 06 09:00:32 localhost systemd[1]: Finished dracut pre-pivot and cleanup hook.
Dec 06 09:00:32 localhost systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Dec 06 09:00:32 localhost systemd[1]: Stopped target Network.
Dec 06 09:00:32 localhost systemd[1]: Stopped target Remote Encrypted Volumes.
Dec 06 09:00:32 localhost systemd[1]: Stopped target Timer Units.
Dec 06 09:00:32 localhost systemd[1]: dbus.socket: Deactivated successfully.
Dec 06 09:00:32 localhost systemd[1]: Closed D-Bus System Message Bus Socket.
Dec 06 09:00:32 localhost systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Dec 06 09:00:32 localhost systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Dec 06 09:00:32 localhost systemd[1]: Stopped target Initrd Default Target.
Dec 06 09:00:32 localhost systemd[1]: Stopped target Basic System.
Dec 06 09:00:32 localhost systemd[1]: Stopped target Initrd Root Device.
Dec 06 09:00:32 localhost systemd[1]: Stopped target Initrd /usr File System.
Dec 06 09:00:32 localhost systemd[1]: Stopped target Path Units.
Dec 06 09:00:32 localhost systemd[1]: Stopped target Remote File Systems.
Dec 06 09:00:32 localhost systemd[1]: Stopped target Preparation for Remote File Systems.
Dec 06 09:00:32 localhost systemd[1]: Stopped target Slice Units.
Dec 06 09:00:32 localhost systemd[1]: Stopped target Socket Units.
Dec 06 09:00:32 localhost systemd[1]: Stopped target System Initialization.
Dec 06 09:00:32 localhost systemd[1]: Stopped target Local File Systems.
Dec 06 09:00:32 localhost systemd[1]: Stopped target Swaps.
Dec 06 09:00:32 localhost systemd[1]: dracut-mount.service: Deactivated successfully.
Dec 06 09:00:32 localhost systemd[1]: Stopped dracut mount hook.
Dec 06 09:00:32 localhost systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Dec 06 09:00:32 localhost systemd[1]: Stopped dracut pre-mount hook.
Dec 06 09:00:32 localhost systemd[1]: Stopped target Local Encrypted Volumes.
Dec 06 09:00:32 localhost systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Dec 06 09:00:32 localhost systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Dec 06 09:00:32 localhost systemd[1]: dracut-initqueue.service: Deactivated successfully.
Dec 06 09:00:32 localhost systemd[1]: Stopped dracut initqueue hook.
Dec 06 09:00:32 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Dec 06 09:00:32 localhost systemd[1]: Stopped Apply Kernel Variables.
Dec 06 09:00:32 localhost systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Dec 06 09:00:32 localhost systemd[1]: Stopped Create Volatile Files and Directories.
Dec 06 09:00:32 localhost systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Dec 06 09:00:32 localhost systemd[1]: Stopped Coldplug All udev Devices.
Dec 06 09:00:32 localhost systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Dec 06 09:00:32 localhost systemd[1]: Stopped dracut pre-trigger hook.
Dec 06 09:00:32 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Dec 06 09:00:32 localhost systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Dec 06 09:00:32 localhost systemd[1]: Stopped Setup Virtual Console.
Dec 06 09:00:32 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Dec 06 09:00:32 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Dec 06 09:00:32 localhost systemd[1]: systemd-udevd.service: Deactivated successfully.
Dec 06 09:00:32 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Dec 06 09:00:32 localhost systemd[1]: systemd-udevd.service: Consumed 1.082s CPU time.
Dec 06 09:00:32 localhost systemd[1]: initrd-cleanup.service: Deactivated successfully.
Dec 06 09:00:32 localhost systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Dec 06 09:00:32 localhost systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Dec 06 09:00:32 localhost systemd[1]: Closed udev Control Socket.
Dec 06 09:00:32 localhost systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Dec 06 09:00:32 localhost systemd[1]: Closed udev Kernel Socket.
Dec 06 09:00:32 localhost systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Dec 06 09:00:32 localhost systemd[1]: Stopped dracut pre-udev hook.
Dec 06 09:00:32 localhost systemd[1]: dracut-cmdline.service: Deactivated successfully.
Dec 06 09:00:32 localhost systemd[1]: Stopped dracut cmdline hook.
Dec 06 09:00:32 localhost systemd[1]: Starting Cleanup udev Database...
Dec 06 09:00:32 localhost systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Dec 06 09:00:32 localhost systemd[1]: Stopped Create Static Device Nodes in /dev.
Dec 06 09:00:32 localhost systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Dec 06 09:00:32 localhost systemd[1]: Stopped Create List of Static Device Nodes.
Dec 06 09:00:32 localhost systemd[1]: systemd-sysusers.service: Deactivated successfully.
Dec 06 09:00:32 localhost systemd[1]: Stopped Create System Users.
Dec 06 09:00:32 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Dec 06 09:00:32 localhost systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Dec 06 09:00:32 localhost systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Dec 06 09:00:32 localhost systemd[1]: Finished Cleanup udev Database.
Dec 06 09:00:32 localhost systemd[1]: Reached target Switch Root.
Dec 06 09:00:32 localhost systemd[1]: Starting Switch Root...
Dec 06 09:00:32 localhost systemd[1]: Switching root.
Dec 06 09:00:32 localhost systemd-journald[309]: Journal stopped
Dec 06 09:00:32 localhost systemd-journald[309]: Received SIGTERM from PID 1 (systemd).
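[Annotation] At this point the initrd hands off to the on-disk root: PID 1 serializes its state, stops the initrd journald, and re-executes from the new root. After login, the kernel/initrd/userspace split for this boot can be checked with the stock systemd tools; a minimal sketch, assuming they are installed on this image:

    systemd-analyze time                      # "Startup finished in ... (kernel) + ... (initrd) + ... (userspace)"
    journalctl -b -o short-monotonic | head   # the same early events, timestamped in seconds since boot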
Dec 06 09:00:32 localhost kernel: audit: type=1404 audit(1765011632.287:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Dec 06 09:00:32 localhost kernel: SELinux:  policy capability network_peer_controls=1
Dec 06 09:00:32 localhost kernel: SELinux:  policy capability open_perms=1
Dec 06 09:00:32 localhost kernel: SELinux:  policy capability extended_socket_class=1
Dec 06 09:00:32 localhost kernel: SELinux:  policy capability always_check_network=0
Dec 06 09:00:32 localhost kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 06 09:00:32 localhost kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 06 09:00:32 localhost kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 06 09:00:32 localhost kernel: audit: type=1403 audit(1765011632.420:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Dec 06 09:00:32 localhost systemd[1]: Successfully loaded SELinux policy in 136.180ms.
Dec 06 09:00:32 localhost systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 25.632ms.
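[Annotation] The type=1404 and type=1403 audit records above mark the switch to enforcing mode and the SELinux policy load (136.180ms). The resulting state can be confirmed from a shell, assuming the policycoreutils and audit userspace packages present on this image:

    getenforce                                # expected: Enforcing
    sestatus                                  # loaded policy name, mode, and policy version
    ausearch -m MAC_STATUS,MAC_POLICY_LOAD    # replays the two audit events logged above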
Dec 06 09:00:32 localhost systemd[1]: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Dec 06 09:00:32 localhost systemd[1]: Detected virtualization kvm.
Dec 06 09:00:32 localhost systemd[1]: Detected architecture x86-64.
Dec 06 09:00:32 localhost systemd-rc-local-generator[639]: /etc/rc.d/rc.local is not marked executable, skipping.
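[Annotation] systemd-rc-local-generator only pulls rc-local.service into the boot transaction when the script carries the execute bit, which is why it is skipped here. If rc.local compatibility is actually wanted, the fix is a single chmod (illustrative; this host simply ships no executable rc.local):

    chmod +x /etc/rc.d/rc.local
    systemctl start rc-local.service   # or reboot; the generator picks it up at boot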
Dec 06 09:00:32 localhost systemd[1]: initrd-switch-root.service: Deactivated successfully.
Dec 06 09:00:32 localhost systemd[1]: Stopped Switch Root.
Dec 06 09:00:32 localhost systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Dec 06 09:00:32 localhost systemd[1]: Created slice Slice /system/getty.
Dec 06 09:00:32 localhost systemd[1]: Created slice Slice /system/serial-getty.
Dec 06 09:00:32 localhost systemd[1]: Created slice Slice /system/sshd-keygen.
Dec 06 09:00:32 localhost systemd[1]: Created slice User and Session Slice.
Dec 06 09:00:32 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Dec 06 09:00:32 localhost systemd[1]: Started Forward Password Requests to Wall Directory Watch.
Dec 06 09:00:32 localhost systemd[1]: Set up automount Arbitrary Executable File Formats File System Automount Point.
Dec 06 09:00:32 localhost systemd[1]: Reached target Local Encrypted Volumes.
Dec 06 09:00:32 localhost systemd[1]: Stopped target Switch Root.
Dec 06 09:00:32 localhost systemd[1]: Stopped target Initrd File Systems.
Dec 06 09:00:32 localhost systemd[1]: Stopped target Initrd Root File System.
Dec 06 09:00:32 localhost systemd[1]: Reached target Local Integrity Protected Volumes.
Dec 06 09:00:32 localhost systemd[1]: Reached target Path Units.
Dec 06 09:00:32 localhost systemd[1]: Reached target rpc_pipefs.target.
Dec 06 09:00:32 localhost systemd[1]: Reached target Slice Units.
Dec 06 09:00:32 localhost systemd[1]: Reached target Swaps.
Dec 06 09:00:32 localhost systemd[1]: Reached target Local Verity Protected Volumes.
Dec 06 09:00:32 localhost systemd[1]: Listening on RPCbind Server Activation Socket.
Dec 06 09:00:32 localhost systemd[1]: Reached target RPC Port Mapper.
Dec 06 09:00:32 localhost systemd[1]: Listening on Process Core Dump Socket.
Dec 06 09:00:32 localhost systemd[1]: Listening on initctl Compatibility Named Pipe.
Dec 06 09:00:32 localhost systemd[1]: Listening on udev Control Socket.
Dec 06 09:00:32 localhost systemd[1]: Listening on udev Kernel Socket.
Dec 06 09:00:32 localhost systemd[1]: Mounting Huge Pages File System...
Dec 06 09:00:32 localhost systemd[1]: Mounting POSIX Message Queue File System...
Dec 06 09:00:32 localhost systemd[1]: Mounting Kernel Debug File System...
Dec 06 09:00:32 localhost systemd[1]: Mounting Kernel Trace File System...
Dec 06 09:00:32 localhost systemd[1]: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Dec 06 09:00:32 localhost systemd[1]: Starting Create List of Static Device Nodes...
Dec 06 09:00:32 localhost systemd[1]: Starting Load Kernel Module configfs...
Dec 06 09:00:32 localhost systemd[1]: Starting Load Kernel Module drm...
Dec 06 09:00:32 localhost systemd[1]: Starting Load Kernel Module efi_pstore...
Dec 06 09:00:32 localhost systemd[1]: Starting Load Kernel Module fuse...
Dec 06 09:00:32 localhost systemd[1]: Starting Read and set NIS domainname from /etc/sysconfig/network...
Dec 06 09:00:32 localhost systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Dec 06 09:00:32 localhost systemd[1]: Stopped File System Check on Root Device.
Dec 06 09:00:32 localhost systemd[1]: Stopped Journal Service.
Dec 06 09:00:32 localhost kernel: fuse: init (API version 7.37)
Dec 06 09:00:32 localhost systemd[1]: Starting Journal Service...
Dec 06 09:00:32 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Dec 06 09:00:32 localhost systemd[1]: Starting Generate network units from Kernel command line...
Dec 06 09:00:32 localhost systemd[1]: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Dec 06 09:00:32 localhost systemd[1]: Starting Remount Root and Kernel File Systems...
Dec 06 09:00:32 localhost systemd[1]: Repartition Root Disk was skipped because no trigger condition checks were met.
Dec 06 09:00:32 localhost systemd[1]: Starting Apply Kernel Variables...
Dec 06 09:00:32 localhost systemd[1]: Starting Coldplug All udev Devices...
Dec 06 09:00:32 localhost systemd[1]: Mounted Huge Pages File System.
Dec 06 09:00:32 localhost kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Dec 06 09:00:32 localhost systemd[1]: Mounted POSIX Message Queue File System.
Dec 06 09:00:32 localhost systemd[1]: Mounted Kernel Debug File System.
Dec 06 09:00:32 localhost systemd[1]: Mounted Kernel Trace File System.
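[Annotation] The xfs message above means the root filesystem lacks the bigtime feature, so its inode timestamps stop at 2038-01-19 (0x7fffffff). One way to verify, assuming an xfsprogs recent enough to report the flag:

    xfs_info / | grep -o 'bigtime=[01]'   # bigtime=0 matches the 2038 limit logged above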
Dec 06 09:00:32 localhost systemd-journald[680]: Journal started
Dec 06 09:00:32 localhost systemd-journald[680]: Runtime Journal (/run/log/journal/4d4ef2323cc3337bbfd9081b2a323b4e) is 8.0M, max 153.6M, 145.6M free.
Dec 06 09:00:32 localhost systemd[1]: Queued start job for default target Multi-User System.
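[Annotation] The queued default target is what systemctl resolves as the boot goal; this image boots to multi-user.target rather than a graphical target. To inspect:

    systemctl get-default                                     # multi-user.target
    systemctl list-dependencies multi-user.target --no-pager | head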
Dec 06 09:00:32 localhost systemd[1]: systemd-journald.service: Deactivated successfully.
Dec 06 09:00:32 localhost systemd[1]: Started Journal Service.
Dec 06 09:00:32 localhost systemd[1]: Finished Create List of Static Device Nodes.
Dec 06 09:00:32 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec 06 09:00:32 localhost systemd[1]: Finished Load Kernel Module configfs.
Dec 06 09:00:32 localhost kernel: ACPI: bus type drm_connector registered
Dec 06 09:00:32 localhost systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Dec 06 09:00:32 localhost systemd[1]: Finished Load Kernel Module efi_pstore.
Dec 06 09:00:32 localhost systemd[1]: modprobe@fuse.service: Deactivated successfully.
Dec 06 09:00:32 localhost systemd[1]: Finished Load Kernel Module fuse.
Dec 06 09:00:32 localhost systemd[1]: modprobe@drm.service: Deactivated successfully.
Dec 06 09:00:32 localhost systemd[1]: Finished Load Kernel Module drm.
Dec 06 09:00:32 localhost systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Dec 06 09:00:32 localhost systemd[1]: Finished Generate network units from Kernel command line.
Dec 06 09:00:32 localhost systemd[1]: Finished Remount Root and Kernel File Systems.
Dec 06 09:00:32 localhost systemd[1]: Finished Apply Kernel Variables.
Dec 06 09:00:32 localhost systemd[1]: Mounting FUSE Control File System...
Dec 06 09:00:32 localhost systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Dec 06 09:00:32 localhost systemd[1]: Starting Rebuild Hardware Database...
Dec 06 09:00:32 localhost systemd[1]: Starting Flush Journal to Persistent Storage...
Dec 06 09:00:32 localhost systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Dec 06 09:00:32 localhost systemd[1]: Starting Load/Save OS Random Seed...
Dec 06 09:00:32 localhost systemd[1]: Starting Create System Users...
Dec 06 09:00:32 localhost systemd[1]: Mounted FUSE Control File System.
Dec 06 09:00:32 localhost systemd-journald[680]: Runtime Journal (/run/log/journal/4d4ef2323cc3337bbfd9081b2a323b4e) is 8.0M, max 153.6M, 145.6M free.
Dec 06 09:00:32 localhost systemd-journald[680]: Received client request to flush runtime journal.
Dec 06 09:00:32 localhost systemd[1]: Finished Flush Journal to Persistent Storage.
Dec 06 09:00:32 localhost systemd[1]: Finished Load/Save OS Random Seed.
Dec 06 09:00:32 localhost systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Dec 06 09:00:33 localhost systemd[1]: Finished Create System Users.
Dec 06 09:00:33 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Dec 06 09:00:33 localhost systemd[1]: Finished Coldplug All udev Devices.
Dec 06 09:00:33 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Dec 06 09:00:33 localhost systemd[1]: Reached target Preparation for Local File Systems.
Dec 06 09:00:33 localhost systemd[1]: Reached target Local File Systems.
Dec 06 09:00:33 localhost systemd[1]: Starting Rebuild Dynamic Linker Cache...
Dec 06 09:00:33 localhost systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Dec 06 09:00:33 localhost systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Dec 06 09:00:33 localhost systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Dec 06 09:00:33 localhost systemd[1]: Starting Automatic Boot Loader Update...
Dec 06 09:00:33 localhost systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Dec 06 09:00:33 localhost systemd[1]: Starting Create Volatile Files and Directories...
Dec 06 09:00:33 localhost bootctl[697]: Couldn't find EFI system partition, skipping.
Dec 06 09:00:33 localhost systemd[1]: Finished Automatic Boot Loader Update.
Dec 06 09:00:33 localhost systemd[1]: Finished Create Volatile Files and Directories.
Dec 06 09:00:33 localhost systemd[1]: Starting Security Auditing Service...
Dec 06 09:00:33 localhost systemd[1]: Starting RPC Bind...
Dec 06 09:00:33 localhost systemd[1]: Starting Rebuild Journal Catalog...
Dec 06 09:00:33 localhost auditd[703]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Dec 06 09:00:33 localhost auditd[703]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Dec 06 09:00:33 localhost systemd[1]: Finished Rebuild Journal Catalog.
Dec 06 09:00:33 localhost systemd[1]: Started RPC Bind.
Dec 06 09:00:33 localhost augenrules[708]: /sbin/augenrules: No change
Dec 06 09:00:33 localhost augenrules[723]: No rules
Dec 06 09:00:33 localhost augenrules[723]: enabled 1
Dec 06 09:00:33 localhost augenrules[723]: failure 1
Dec 06 09:00:33 localhost augenrules[723]: pid 703
Dec 06 09:00:33 localhost augenrules[723]: rate_limit 0
Dec 06 09:00:33 localhost augenrules[723]: backlog_limit 8192
Dec 06 09:00:33 localhost augenrules[723]: lost 0
Dec 06 09:00:33 localhost augenrules[723]: backlog 3
Dec 06 09:00:33 localhost augenrules[723]: backlog_wait_time 60000
Dec 06 09:00:33 localhost augenrules[723]: backlog_wait_time_actual 0
Dec 06 09:00:33 localhost augenrules[723]: enabled 1
Dec 06 09:00:33 localhost augenrules[723]: failure 1
Dec 06 09:00:33 localhost augenrules[723]: pid 703
Dec 06 09:00:33 localhost augenrules[723]: rate_limit 0
Dec 06 09:00:33 localhost augenrules[723]: backlog_limit 8192
Dec 06 09:00:33 localhost augenrules[723]: lost 0
Dec 06 09:00:33 localhost augenrules[723]: backlog 2
Dec 06 09:00:33 localhost augenrules[723]: backlog_wait_time 60000
Dec 06 09:00:33 localhost augenrules[723]: backlog_wait_time_actual 0
Dec 06 09:00:33 localhost augenrules[723]: enabled 1
Dec 06 09:00:33 localhost augenrules[723]: failure 1
Dec 06 09:00:33 localhost augenrules[723]: pid 703
Dec 06 09:00:33 localhost augenrules[723]: rate_limit 0
Dec 06 09:00:33 localhost augenrules[723]: backlog_limit 8192
Dec 06 09:00:33 localhost augenrules[723]: lost 0
Dec 06 09:00:33 localhost augenrules[723]: backlog 2
Dec 06 09:00:33 localhost augenrules[723]: backlog_wait_time 60000
Dec 06 09:00:33 localhost augenrules[723]: backlog_wait_time_actual 0
Dec 06 09:00:33 localhost systemd[1]: Started Security Auditing Service.
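[Annotation] The repeated enabled/failure/pid/backlog lines from augenrules are the kernel audit status, printed once per pass; with no rules shipped, only the backlog count changes (3, then 2). The same fields can be read back at any time, assuming auditctl from the audit package:

    auditctl -s    # enabled 1, failure 1, pid 703, backlog_limit 8192, ... as logged above
    auditctl -l    # "No rules", matching the augenrules output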
Dec 06 09:00:33 localhost systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Dec 06 09:00:33 localhost systemd[1]: Finished Rebuild Dynamic Linker Cache.
Dec 06 09:00:33 localhost systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Dec 06 09:00:33 localhost systemd[1]: Finished Rebuild Hardware Database.
Dec 06 09:00:33 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Dec 06 09:00:33 localhost systemd[1]: Starting Update is Completed...
Dec 06 09:00:33 localhost systemd[1]: Finished Update is Completed.
Dec 06 09:00:33 localhost systemd-udevd[731]: Using default interface naming scheme 'rhel-9.0'.
Dec 06 09:00:33 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Dec 06 09:00:33 localhost systemd[1]: Reached target System Initialization.
Dec 06 09:00:33 localhost systemd[1]: Started dnf makecache --timer.
Dec 06 09:00:33 localhost systemd[1]: Started Daily rotation of log files.
Dec 06 09:00:33 localhost systemd[1]: Started Daily Cleanup of Temporary Directories.
Dec 06 09:00:33 localhost systemd[1]: Reached target Timer Units.
Dec 06 09:00:33 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Dec 06 09:00:33 localhost systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Dec 06 09:00:33 localhost systemd[1]: Reached target Socket Units.
Dec 06 09:00:33 localhost systemd-udevd[743]: Network interface NamePolicy= disabled on kernel command line.
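[Annotation] NamePolicy= is disabled because the kernel command line carries net.ifnames=0, which is why the NIC keeps the legacy name eth0 instead of a predictable one. Two read-only checks (the udevadm call is illustrative and changes nothing):

    tr ' ' '\n' </proc/cmdline | grep net.ifnames                  # net.ifnames=0
    udevadm test-builtin net_id /sys/class/net/eth0 2>/dev/null    # names udev would otherwise assign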
Dec 06 09:00:33 localhost systemd[1]: Starting D-Bus System Message Bus...
Dec 06 09:00:33 localhost systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Dec 06 09:00:33 localhost systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Dec 06 09:00:33 localhost systemd[1]: Starting Load Kernel Module configfs...
Dec 06 09:00:33 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec 06 09:00:33 localhost systemd[1]: Finished Load Kernel Module configfs.
Dec 06 09:00:33 localhost systemd[1]: Started D-Bus System Message Bus.
Dec 06 09:00:33 localhost systemd[1]: Reached target Basic System.
Dec 06 09:00:33 localhost dbus-broker-lau[770]: Ready
Dec 06 09:00:33 localhost systemd[1]: Starting NTP client/server...
Dec 06 09:00:33 localhost systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Dec 06 09:00:33 localhost systemd[1]: Starting Restore /run/initramfs on shutdown...
Dec 06 09:00:33 localhost systemd[1]: Starting IPv4 firewall with iptables...
Dec 06 09:00:33 localhost kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Dec 06 09:00:33 localhost kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Dec 06 09:00:33 localhost kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Dec 06 09:00:33 localhost systemd[1]: Started irqbalance daemon.
Dec 06 09:00:33 localhost systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Dec 06 09:00:33 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 06 09:00:33 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 06 09:00:33 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 06 09:00:33 localhost systemd[1]: Reached target sshd-keygen.target.
Dec 06 09:00:33 localhost systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Dec 06 09:00:33 localhost systemd[1]: Reached target User and Group Name Lookups.
Dec 06 09:00:33 localhost kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Dec 06 09:00:33 localhost systemd[1]: Starting User Login Management...
Dec 06 09:00:33 localhost systemd[1]: Finished Restore /run/initramfs on shutdown.
Dec 06 09:00:33 localhost chronyd[796]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Dec 06 09:00:33 localhost chronyd[796]: Loaded 0 symmetric keys
Dec 06 09:00:33 localhost chronyd[796]: Using right/UTC timezone to obtain leap second data
Dec 06 09:00:33 localhost chronyd[796]: Loaded seccomp filter (level 2)
Dec 06 09:00:33 localhost kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Dec 06 09:00:33 localhost kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Dec 06 09:00:33 localhost systemd[1]: Started NTP client/server.
Dec 06 09:00:33 localhost kernel: Console: switching to colour dummy device 80x25
Dec 06 09:00:33 localhost kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Dec 06 09:00:33 localhost kernel: [drm] features: -context_init
Dec 06 09:00:33 localhost kernel: [drm] number of scanouts: 1
Dec 06 09:00:33 localhost kernel: [drm] number of cap sets: 0
Dec 06 09:00:33 localhost kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Dec 06 09:00:33 localhost kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Dec 06 09:00:33 localhost systemd-logind[788]: Watching system buttons on /dev/input/event0 (Power Button)
Dec 06 09:00:33 localhost systemd-logind[788]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Dec 06 09:00:33 localhost systemd-logind[788]: New seat seat0.
Dec 06 09:00:33 localhost systemd[1]: Started User Login Management.
Dec 06 09:00:33 localhost kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Dec 06 09:00:33 localhost kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Dec 06 09:00:33 localhost kernel: Console: switching to colour frame buffer device 128x48
Dec 06 09:00:33 localhost kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Dec 06 09:00:34 localhost kernel: kvm_amd: TSC scaling supported
Dec 06 09:00:34 localhost kernel: kvm_amd: Nested Virtualization enabled
Dec 06 09:00:34 localhost kernel: kvm_amd: Nested Paging enabled
Dec 06 09:00:34 localhost kernel: kvm_amd: LBR virtualization supported
Dec 06 09:00:34 localhost iptables.init[782]: iptables: Applying firewall rules: [  OK  ]
Dec 06 09:00:34 localhost systemd[1]: Finished IPv4 firewall with iptables.
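[Annotation] The two nft_compat deprecation warnings and this iptables unit fit together: on RHEL 9 the iptables command is the nft-backed variant, so the legacy rules loaded by iptables.init pass through the deprecated compat layer. The same ruleset is visible from either side, assuming both frontends are installed:

    iptables -S          # legacy-syntax view of the rules just applied
    nft list ruleset     # the same rules as stored in the nftables backend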
Dec 06 09:00:34 localhost cloud-init[841]: Cloud-init v. 24.4-7.el9 running 'init-local' at Sat, 06 Dec 2025 09:00:34 +0000. Up 6.99 seconds.
Dec 06 09:00:34 localhost kernel: ISO 9660 Extensions: Microsoft Joliet Level 3
Dec 06 09:00:34 localhost kernel: ISO 9660 Extensions: RRIP_1991A
Dec 06 09:00:34 localhost systemd[1]: run-cloud\x2dinit-tmp-tmp5zgaxgc3.mount: Deactivated successfully.
Dec 06 09:00:34 localhost systemd[1]: Starting Hostname Service...
Dec 06 09:00:34 localhost systemd[1]: Started Hostname Service.
Dec 06 09:00:34 np0005548916.novalocal systemd-hostnamed[855]: Hostname set to <np0005548916.novalocal> (static)
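[Annotation] The hostname switches here from the transient localhost to the static name pushed from the OpenStack metadata, which is why every later line carries np0005548916.novalocal. The static/transient split is visible with hostnamectl:

    hostnamectl status   # Static hostname: np0005548916.novalocal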
Dec 06 09:00:34 np0005548916.novalocal systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Dec 06 09:00:34 np0005548916.novalocal systemd[1]: Reached target Preparation for Network.
Dec 06 09:00:34 np0005548916.novalocal systemd[1]: Starting Network Manager...
Dec 06 09:00:34 np0005548916.novalocal NetworkManager[859]: <info>  [1765011634.9505] NetworkManager (version 1.54.1-1.el9) is starting... (boot:27715b31-3399-4bbf-a0fa-54836c80918e)
Dec 06 09:00:34 np0005548916.novalocal NetworkManager[859]: <info>  [1765011634.9512] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Dec 06 09:00:34 np0005548916.novalocal NetworkManager[859]: <info>  [1765011634.9600] manager[0x559845864080]: monitoring kernel firmware directory '/lib/firmware'.
Dec 06 09:00:34 np0005548916.novalocal NetworkManager[859]: <info>  [1765011634.9660] hostname: hostname: using hostnamed
Dec 06 09:00:34 np0005548916.novalocal NetworkManager[859]: <info>  [1765011634.9660] hostname: static hostname changed from (none) to "np0005548916.novalocal"
Dec 06 09:00:34 np0005548916.novalocal NetworkManager[859]: <info>  [1765011634.9665] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Dec 06 09:00:34 np0005548916.novalocal NetworkManager[859]: <info>  [1765011634.9797] manager[0x559845864080]: rfkill: Wi-Fi hardware radio set enabled
Dec 06 09:00:34 np0005548916.novalocal NetworkManager[859]: <info>  [1765011634.9798] manager[0x559845864080]: rfkill: WWAN hardware radio set enabled
Dec 06 09:00:34 np0005548916.novalocal NetworkManager[859]: <info>  [1765011634.9846] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Dec 06 09:00:34 np0005548916.novalocal NetworkManager[859]: <info>  [1765011634.9846] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Dec 06 09:00:34 np0005548916.novalocal NetworkManager[859]: <info>  [1765011634.9847] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Dec 06 09:00:34 np0005548916.novalocal NetworkManager[859]: <info>  [1765011634.9847] manager: Networking is enabled by state file
Dec 06 09:00:34 np0005548916.novalocal NetworkManager[859]: <info>  [1765011634.9850] settings: Loaded settings plugin: keyfile (internal)
Dec 06 09:00:34 np0005548916.novalocal NetworkManager[859]: <info>  [1765011634.9862] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Dec 06 09:00:34 np0005548916.novalocal NetworkManager[859]: <info>  [1765011634.9881] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Dec 06 09:00:34 np0005548916.novalocal NetworkManager[859]: <info>  [1765011634.9894] dhcp: init: Using DHCP client 'internal'
Dec 06 09:00:34 np0005548916.novalocal NetworkManager[859]: <info>  [1765011634.9897] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Dec 06 09:00:34 np0005548916.novalocal NetworkManager[859]: <info>  [1765011634.9913] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 06 09:00:34 np0005548916.novalocal systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Dec 06 09:00:34 np0005548916.novalocal NetworkManager[859]: <info>  [1765011634.9920] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Dec 06 09:00:34 np0005548916.novalocal NetworkManager[859]: <info>  [1765011634.9929] device (lo): Activation: starting connection 'lo' (04d45710-56f6-4696-9924-dd30b84bf74f)
Dec 06 09:00:34 np0005548916.novalocal NetworkManager[859]: <info>  [1765011634.9940] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Dec 06 09:00:34 np0005548916.novalocal NetworkManager[859]: <info>  [1765011634.9943] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 06 09:00:34 np0005548916.novalocal NetworkManager[859]: <info>  [1765011634.9976] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Dec 06 09:00:34 np0005548916.novalocal NetworkManager[859]: <info>  [1765011634.9979] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Dec 06 09:00:34 np0005548916.novalocal NetworkManager[859]: <info>  [1765011634.9981] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Dec 06 09:00:34 np0005548916.novalocal NetworkManager[859]: <info>  [1765011634.9983] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Dec 06 09:00:34 np0005548916.novalocal NetworkManager[859]: <info>  [1765011634.9985] device (eth0): carrier: link connected
Dec 06 09:00:34 np0005548916.novalocal NetworkManager[859]: <info>  [1765011634.9987] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Dec 06 09:00:34 np0005548916.novalocal NetworkManager[859]: <info>  [1765011634.9993] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Dec 06 09:00:34 np0005548916.novalocal NetworkManager[859]: <info>  [1765011634.9999] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec 06 09:00:35 np0005548916.novalocal NetworkManager[859]: <info>  [1765011635.0003] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec 06 09:00:35 np0005548916.novalocal NetworkManager[859]: <info>  [1765011635.0004] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 06 09:00:35 np0005548916.novalocal NetworkManager[859]: <info>  [1765011635.0006] manager: NetworkManager state is now CONNECTING
Dec 06 09:00:35 np0005548916.novalocal NetworkManager[859]: <info>  [1765011635.0007] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 06 09:00:35 np0005548916.novalocal NetworkManager[859]: <info>  [1765011635.0013] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 06 09:00:35 np0005548916.novalocal NetworkManager[859]: <info>  [1765011635.0016] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 06 09:00:35 np0005548916.novalocal NetworkManager[859]: <info>  [1765011635.0047] dhcp4 (eth0): state changed new lease, address=38.102.83.113
Dec 06 09:00:35 np0005548916.novalocal NetworkManager[859]: <info>  [1765011635.0054] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Dec 06 09:00:35 np0005548916.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 06 09:00:35 np0005548916.novalocal NetworkManager[859]: <info>  [1765011635.0076] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 06 09:00:35 np0005548916.novalocal systemd[1]: Started Network Manager.
Dec 06 09:00:35 np0005548916.novalocal systemd[1]: Reached target Network.
Dec 06 09:00:35 np0005548916.novalocal systemd[1]: Starting Network Manager Wait Online...
Dec 06 09:00:35 np0005548916.novalocal systemd[1]: Starting GSSAPI Proxy Daemon...
Dec 06 09:00:35 np0005548916.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 06 09:00:35 np0005548916.novalocal NetworkManager[859]: <info>  [1765011635.0238] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Dec 06 09:00:35 np0005548916.novalocal NetworkManager[859]: <info>  [1765011635.0242] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Dec 06 09:00:35 np0005548916.novalocal NetworkManager[859]: <info>  [1765011635.0248] device (lo): Activation: successful, device activated.
Dec 06 09:00:35 np0005548916.novalocal NetworkManager[859]: <info>  [1765011635.0254] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 06 09:00:35 np0005548916.novalocal NetworkManager[859]: <info>  [1765011635.0256] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 06 09:00:35 np0005548916.novalocal NetworkManager[859]: <info>  [1765011635.0259] manager: NetworkManager state is now CONNECTED_SITE
Dec 06 09:00:35 np0005548916.novalocal NetworkManager[859]: <info>  [1765011635.0262] device (eth0): Activation: successful, device activated.
Dec 06 09:00:35 np0005548916.novalocal NetworkManager[859]: <info>  [1765011635.0268] manager: NetworkManager state is now CONNECTED_GLOBAL
Dec 06 09:00:35 np0005548916.novalocal NetworkManager[859]: <info>  [1765011635.0270] manager: startup complete
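[Annotation] NetworkManager is now CONNECTED_GLOBAL with 'System eth0' active via the internal DHCP client (45 s transaction timeout, lease 38.102.83.113). A few read-only checks that mirror the state machine above, assuming nmcli is available:

    nmcli -g GENERAL.STATE device show eth0   # 100 (connected)
    nmcli -g IP4.ADDRESS device show eth0     # 38.102.83.113/24
    nmcli connection show 'System eth0' | grep -i dhcp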
Dec 06 09:00:35 np0005548916.novalocal systemd[1]: Started GSSAPI Proxy Daemon.
Dec 06 09:00:35 np0005548916.novalocal systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Dec 06 09:00:35 np0005548916.novalocal systemd[1]: Reached target NFS client services.
Dec 06 09:00:35 np0005548916.novalocal systemd[1]: Reached target Preparation for Remote File Systems.
Dec 06 09:00:35 np0005548916.novalocal systemd[1]: Reached target Remote File Systems.
Dec 06 09:00:35 np0005548916.novalocal systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Dec 06 09:00:35 np0005548916.novalocal systemd[1]: Finished Network Manager Wait Online.
Dec 06 09:00:35 np0005548916.novalocal systemd[1]: Starting Cloud-init: Network Stage...
Dec 06 09:00:35 np0005548916.novalocal cloud-init[923]: Cloud-init v. 24.4-7.el9 running 'init' at Sat, 06 Dec 2025 09:00:35 +0000. Up 8.01 seconds.
Dec 06 09:00:35 np0005548916.novalocal cloud-init[923]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Dec 06 09:00:35 np0005548916.novalocal cloud-init[923]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Dec 06 09:00:35 np0005548916.novalocal cloud-init[923]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Dec 06 09:00:35 np0005548916.novalocal cloud-init[923]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Dec 06 09:00:35 np0005548916.novalocal cloud-init[923]: ci-info: |  eth0  | True |        38.102.83.113         | 255.255.255.0 | global | fa:16:3e:44:48:bb |
Dec 06 09:00:35 np0005548916.novalocal cloud-init[923]: ci-info: |  eth0  | True | fe80::f816:3eff:fe44:48bb/64 |       .       |  link  | fa:16:3e:44:48:bb |
Dec 06 09:00:35 np0005548916.novalocal cloud-init[923]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Dec 06 09:00:35 np0005548916.novalocal cloud-init[923]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Dec 06 09:00:35 np0005548916.novalocal cloud-init[923]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Dec 06 09:00:35 np0005548916.novalocal cloud-init[923]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Dec 06 09:00:35 np0005548916.novalocal cloud-init[923]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Dec 06 09:00:35 np0005548916.novalocal cloud-init[923]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Dec 06 09:00:35 np0005548916.novalocal cloud-init[923]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Dec 06 09:00:35 np0005548916.novalocal cloud-init[923]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Dec 06 09:00:35 np0005548916.novalocal cloud-init[923]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Dec 06 09:00:35 np0005548916.novalocal cloud-init[923]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Dec 06 09:00:35 np0005548916.novalocal cloud-init[923]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Dec 06 09:00:35 np0005548916.novalocal cloud-init[923]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Dec 06 09:00:35 np0005548916.novalocal cloud-init[923]: ci-info: +-------+-------------+---------+-----------+-------+
Dec 06 09:00:35 np0005548916.novalocal cloud-init[923]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Dec 06 09:00:35 np0005548916.novalocal cloud-init[923]: ci-info: +-------+-------------+---------+-----------+-------+
Dec 06 09:00:35 np0005548916.novalocal cloud-init[923]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Dec 06 09:00:35 np0005548916.novalocal cloud-init[923]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Dec 06 09:00:35 np0005548916.novalocal cloud-init[923]: ci-info: +-------+-------------+---------+-----------+-------+
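[Annotation] The ci-info tables are cloud-init's snapshot of the interface and routing state: one global IPv4 on eth0, a default route via 38.102.83.1, and a /32 host route to the 169.254.169.254 metadata service via 38.102.83.126. The same information straight from iproute2:

    ip -br addr show eth0   # 38.102.83.113/24 plus the fe80:: link-local address
    ip route show           # default via 38.102.83.1; 169.254.169.254 via 38.102.83.126
    ip -6 route show dev eth0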
Dec 06 09:00:36 np0005548916.novalocal useradd[990]: new group: name=cloud-user, GID=1001
Dec 06 09:00:36 np0005548916.novalocal useradd[990]: new user: name=cloud-user, UID=1001, GID=1001, home=/home/cloud-user, shell=/bin/bash, from=none
Dec 06 09:00:36 np0005548916.novalocal useradd[990]: add 'cloud-user' to group 'adm'
Dec 06 09:00:36 np0005548916.novalocal useradd[990]: add 'cloud-user' to group 'systemd-journal'
Dec 06 09:00:36 np0005548916.novalocal useradd[990]: add 'cloud-user' to shadow group 'adm'
Dec 06 09:00:36 np0005548916.novalocal useradd[990]: add 'cloud-user' to shadow group 'systemd-journal'
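[Annotation] cloud-init's users module created cloud-user (UID/GID 1001) and added it to adm and systemd-journal, including the matching shadow-group entries. To verify:

    id cloud-user                       # lists adm and systemd-journal among the supplementary groups
    getent group adm systemd-journal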
Dec 06 09:00:36 np0005548916.novalocal cloud-init[923]: Generating public/private rsa key pair.
Dec 06 09:00:36 np0005548916.novalocal cloud-init[923]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Dec 06 09:00:36 np0005548916.novalocal cloud-init[923]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Dec 06 09:00:36 np0005548916.novalocal cloud-init[923]: The key fingerprint is:
Dec 06 09:00:36 np0005548916.novalocal cloud-init[923]: SHA256:aFdqn80bOkCfNh8DURqnA70R3wlMATbYgjvI8ZQBB8w root@np0005548916.novalocal
Dec 06 09:00:36 np0005548916.novalocal cloud-init[923]: The key's randomart image is:
Dec 06 09:00:36 np0005548916.novalocal cloud-init[923]: +---[RSA 3072]----+
Dec 06 09:00:36 np0005548916.novalocal cloud-init[923]: |   oooo+.+***.   |
Dec 06 09:00:36 np0005548916.novalocal cloud-init[923]: |    E.+ oo=B.o . |
Dec 06 09:00:36 np0005548916.novalocal cloud-init[923]: |   . = . .=+. o  |
Dec 06 09:00:36 np0005548916.novalocal cloud-init[923]: |    o +..oo.     |
Dec 06 09:00:36 np0005548916.novalocal cloud-init[923]: |      ooS. o     |
Dec 06 09:00:36 np0005548916.novalocal cloud-init[923]: |     . o..=+o    |
Dec 06 09:00:36 np0005548916.novalocal cloud-init[923]: |         ooo+o   |
Dec 06 09:00:36 np0005548916.novalocal cloud-init[923]: |          ...o   |
Dec 06 09:00:36 np0005548916.novalocal cloud-init[923]: |          ...    |
Dec 06 09:00:36 np0005548916.novalocal cloud-init[923]: +----[SHA256]-----+
Dec 06 09:00:36 np0005548916.novalocal cloud-init[923]: Generating public/private ecdsa key pair.
Dec 06 09:00:36 np0005548916.novalocal cloud-init[923]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Dec 06 09:00:36 np0005548916.novalocal cloud-init[923]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Dec 06 09:00:36 np0005548916.novalocal cloud-init[923]: The key fingerprint is:
Dec 06 09:00:36 np0005548916.novalocal cloud-init[923]: SHA256:/V9Nf6VbQ7CIW8plp6d6vVny2tXx2wtLLC0qPtvLCN0 root@np0005548916.novalocal
Dec 06 09:00:36 np0005548916.novalocal cloud-init[923]: The key's randomart image is:
Dec 06 09:00:36 np0005548916.novalocal cloud-init[923]: +---[ECDSA 256]---+
Dec 06 09:00:36 np0005548916.novalocal cloud-init[923]: |                 |
Dec 06 09:00:36 np0005548916.novalocal cloud-init[923]: |                 |
Dec 06 09:00:36 np0005548916.novalocal cloud-init[923]: |             .   |
Dec 06 09:00:36 np0005548916.novalocal cloud-init[923]: |         .. . o  |
Dec 06 09:00:36 np0005548916.novalocal cloud-init[923]: |        S..= o oo|
Dec 06 09:00:36 np0005548916.novalocal cloud-init[923]: |      . o *.= .oB|
Dec 06 09:00:36 np0005548916.novalocal cloud-init[923]: |     . . E +oB.oX|
Dec 06 09:00:36 np0005548916.novalocal cloud-init[923]: |      .o+ ..*oO+*|
Dec 06 09:00:36 np0005548916.novalocal cloud-init[923]: |      .++*+. ===o|
Dec 06 09:00:36 np0005548916.novalocal cloud-init[923]: +----[SHA256]-----+
Dec 06 09:00:36 np0005548916.novalocal cloud-init[923]: Generating public/private ed25519 key pair.
Dec 06 09:00:36 np0005548916.novalocal cloud-init[923]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Dec 06 09:00:36 np0005548916.novalocal cloud-init[923]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Dec 06 09:00:36 np0005548916.novalocal cloud-init[923]: The key fingerprint is:
Dec 06 09:00:36 np0005548916.novalocal cloud-init[923]: SHA256:UXMFl6H0o8BuOE4G7OkZodxjFsGSMf8vDaZdvN0lYLY root@np0005548916.novalocal
Dec 06 09:00:36 np0005548916.novalocal cloud-init[923]: The key's randomart image is:
Dec 06 09:00:36 np0005548916.novalocal cloud-init[923]: +--[ED25519 256]--+
Dec 06 09:00:36 np0005548916.novalocal cloud-init[923]: |    o+.   o +o+o |
Dec 06 09:00:36 np0005548916.novalocal cloud-init[923]: |    o+.. o + +.  |
Dec 06 09:00:36 np0005548916.novalocal cloud-init[923]: |     .* . o = o  |
Dec 06 09:00:36 np0005548916.novalocal cloud-init[923]: |   . + * = + + . |
Dec 06 09:00:36 np0005548916.novalocal cloud-init[923]: |    o O S = E . .|
Dec 06 09:00:36 np0005548916.novalocal cloud-init[923]: |     + @ B o . o |
Dec 06 09:00:36 np0005548916.novalocal cloud-init[923]: |      + + + . .  |
Dec 06 09:00:36 np0005548916.novalocal cloud-init[923]: |         .       |
Dec 06 09:00:36 np0005548916.novalocal cloud-init[923]: |                 |
Dec 06 09:00:36 np0005548916.novalocal cloud-init[923]: +----[SHA256]-----+
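[Annotation] The three randomart blocks belong to the RSA-3072, ECDSA-256, and ED25519-256 host keys cloud-init just generated (the stock sshd-keygen units were skipped earlier in its favour). The fingerprints can be re-derived at any time:

    for k in /etc/ssh/ssh_host_*_key.pub; do ssh-keygen -lf "$k"; done
    # e.g. 256 SHA256:UXMFl6H0o8BuOE4G7OkZodxjFsGSMf8vDaZdvN0lYLY root@np0005548916.novalocal (ED25519)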
Dec 06 09:00:36 np0005548916.novalocal sm-notify[1006]: Version 2.5.4 starting
Dec 06 09:00:36 np0005548916.novalocal systemd[1]: Finished Cloud-init: Network Stage.
Dec 06 09:00:36 np0005548916.novalocal sshd[1008]: Server listening on 0.0.0.0 port 22.
Dec 06 09:00:36 np0005548916.novalocal systemd[1]: Reached target Cloud-config availability.
Dec 06 09:00:36 np0005548916.novalocal sshd[1008]: Server listening on :: port 22.
Dec 06 09:00:36 np0005548916.novalocal systemd[1]: Reached target Network is Online.
Dec 06 09:00:36 np0005548916.novalocal systemd[1]: Starting Cloud-init: Config Stage...
Dec 06 09:00:36 np0005548916.novalocal systemd[1]: Starting Crash recovery kernel arming...
Dec 06 09:00:36 np0005548916.novalocal systemd[1]: Starting Notify NFS peers of a restart...
Dec 06 09:00:36 np0005548916.novalocal systemd[1]: Starting System Logging Service...
Dec 06 09:00:36 np0005548916.novalocal systemd[1]: Starting OpenSSH server daemon...
Dec 06 09:00:36 np0005548916.novalocal systemd[1]: Starting Permit User Sessions...
Dec 06 09:00:36 np0005548916.novalocal systemd[1]: Started Notify NFS peers of a restart.
Dec 06 09:00:36 np0005548916.novalocal systemd[1]: Started OpenSSH server daemon.
Dec 06 09:00:36 np0005548916.novalocal systemd[1]: Finished Permit User Sessions.
Dec 06 09:00:36 np0005548916.novalocal systemd[1]: Started Command Scheduler.
Dec 06 09:00:36 np0005548916.novalocal systemd[1]: Started Getty on tty1.
Dec 06 09:00:36 np0005548916.novalocal systemd[1]: Started Serial Getty on ttyS0.
Dec 06 09:00:36 np0005548916.novalocal crond[1011]: (CRON) STARTUP (1.5.7)
Dec 06 09:00:36 np0005548916.novalocal crond[1011]: (CRON) INFO (Syslog will be used instead of sendmail.)
Dec 06 09:00:36 np0005548916.novalocal crond[1011]: (CRON) INFO (RANDOM_DELAY will be scaled with factor 53% if used.)
Dec 06 09:00:36 np0005548916.novalocal crond[1011]: (CRON) INFO (running with inotify support)
Dec 06 09:00:36 np0005548916.novalocal systemd[1]: Reached target Login Prompts.
Dec 06 09:00:36 np0005548916.novalocal rsyslogd[1007]: [origin software="rsyslogd" swVersion="8.2510.0-2.el9" x-pid="1007" x-info="https://www.rsyslog.com"] start
Dec 06 09:00:36 np0005548916.novalocal rsyslogd[1007]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Dec 06 09:00:36 np0005548916.novalocal systemd[1]: Started System Logging Service.
Dec 06 09:00:36 np0005548916.novalocal systemd[1]: Reached target Multi-User System.
Dec 06 09:00:36 np0005548916.novalocal systemd[1]: Starting Record Runlevel Change in UTMP...
Dec 06 09:00:36 np0005548916.novalocal systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Dec 06 09:00:36 np0005548916.novalocal systemd[1]: Finished Record Runlevel Change in UTMP.
Dec 06 09:00:36 np0005548916.novalocal rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 06 09:00:36 np0005548916.novalocal kdumpctl[1016]: kdump: No kdump initial ramdisk found.
Dec 06 09:00:36 np0005548916.novalocal kdumpctl[1016]: kdump: Rebuilding /boot/initramfs-5.14.0-645.el9.x86_64kdump.img
Dec 06 09:00:36 np0005548916.novalocal sshd-session[1100]: Connection closed by 38.102.83.114 port 43468 [preauth]
Dec 06 09:00:36 np0005548916.novalocal sshd-session[1117]: Unable to negotiate with 38.102.83.114 port 45332: no matching host key type found. Their offer: ssh-ed25519,ssh-ed25519-cert-v01@openssh.com [preauth]
Dec 06 09:00:36 np0005548916.novalocal sshd-session[1136]: Unable to negotiate with 38.102.83.114 port 45356: no matching host key type found. Their offer: ecdsa-sha2-nistp384,ecdsa-sha2-nistp384-cert-v01@openssh.com [preauth]
Dec 06 09:00:36 np0005548916.novalocal cloud-init[1141]: Cloud-init v. 24.4-7.el9 running 'modules:config' at Sat, 06 Dec 2025 09:00:36 +0000. Up 9.58 seconds.
Dec 06 09:00:36 np0005548916.novalocal sshd-session[1143]: Unable to negotiate with 38.102.83.114 port 45364: no matching host key type found. Their offer: ecdsa-sha2-nistp521,ecdsa-sha2-nistp521-cert-v01@openssh.com [preauth]
Dec 06 09:00:37 np0005548916.novalocal systemd[1]: Finished Cloud-init: Config Stage.
Dec 06 09:00:37 np0005548916.novalocal sshd-session[1168]: Unable to negotiate with 38.102.83.114 port 45380: no matching host key type found. Their offer: ssh-rsa,ssh-rsa-cert-v01@openssh.com [preauth]
Dec 06 09:00:37 np0005548916.novalocal sshd-session[1128]: Connection closed by 38.102.83.114 port 45346 [preauth]
Dec 06 09:00:37 np0005548916.novalocal systemd[1]: Starting Cloud-init: Final Stage...
Dec 06 09:00:37 np0005548916.novalocal sshd-session[1176]: Unable to negotiate with 38.102.83.114 port 45382: no matching host key type found. Their offer: ssh-dss,ssh-dss-cert-v01@openssh.com [preauth]
Dec 06 09:00:37 np0005548916.novalocal sshd-session[1150]: Connection closed by 38.102.83.114 port 45374 [preauth]
Dec 06 09:00:37 np0005548916.novalocal sshd-session[1157]: Connection closed by 38.102.83.114 port 45378 [preauth]
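[Annotation] The burst of "Unable to negotiate ... no matching host key type found" lines is a peer at 38.102.83.114 offering exactly one host-key family per connection (ssh-ed25519, nistp384, nistp521, ssh-rsa, ssh-dss), consistent with a scanner enumerating supported algorithms rather than a failing client. The pattern can be reproduced against this sshd with a pinned algorithm (illustrative command; 'server' stands in for this host):

    ssh -o HostKeyAlgorithms=ssh-dss -o BatchMode=yes server true
    # the client-side error mirrors the server log: no matching host key type found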
Dec 06 09:00:37 np0005548916.novalocal dracut[1286]: dracut-057-102.git20250818.el9
Dec 06 09:00:37 np0005548916.novalocal cloud-init[1304]: Cloud-init v. 24.4-7.el9 running 'modules:final' at Sat, 06 Dec 2025 09:00:37 +0000. Up 9.99 seconds.
Dec 06 09:00:37 np0005548916.novalocal cloud-init[1311]: #############################################################
Dec 06 09:00:37 np0005548916.novalocal cloud-init[1316]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Dec 06 09:00:37 np0005548916.novalocal dracut[1288]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/fcf6b761-831a-48a7-9f5f-068b5063763f /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-645.el9.x86_64kdump.img 5.14.0-645.el9.x86_64
Dec 06 09:00:37 np0005548916.novalocal cloud-init[1322]: 256 SHA256:/V9Nf6VbQ7CIW8plp6d6vVny2tXx2wtLLC0qPtvLCN0 root@np0005548916.novalocal (ECDSA)
Dec 06 09:00:37 np0005548916.novalocal cloud-init[1328]: 256 SHA256:UXMFl6H0o8BuOE4G7OkZodxjFsGSMf8vDaZdvN0lYLY root@np0005548916.novalocal (ED25519)
Dec 06 09:00:37 np0005548916.novalocal cloud-init[1334]: 3072 SHA256:aFdqn80bOkCfNh8DURqnA70R3wlMATbYgjvI8ZQBB8w root@np0005548916.novalocal (RSA)
Dec 06 09:00:37 np0005548916.novalocal cloud-init[1336]: -----END SSH HOST KEY FINGERPRINTS-----
Dec 06 09:00:37 np0005548916.novalocal cloud-init[1338]: #############################################################
Dec 06 09:00:37 np0005548916.novalocal cloud-init[1304]: Cloud-init v. 24.4-7.el9 finished at Sat, 06 Dec 2025 09:00:37 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 10.17 seconds
Dec 06 09:00:37 np0005548916.novalocal systemd[1]: Finished Cloud-init: Final Stage.
Dec 06 09:00:37 np0005548916.novalocal systemd[1]: Reached target Cloud-init target.
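[Annotation] All four cloud-init stages (init-local, init, modules:config, modules:final) completed in about 10 seconds against the ConfigDrive datasource on /dev/sr0 (the ISO 9660 messages earlier). The result stays queryable after boot:

    cloud-init status --long      # status: done, plus per-stage detail
    cloud-init query --list-keys  # top-level instance-data keys read from the ConfigDrive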
Dec 06 09:00:37 np0005548916.novalocal dracut[1288]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Dec 06 09:00:37 np0005548916.novalocal dracut[1288]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Dec 06 09:00:37 np0005548916.novalocal dracut[1288]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Dec 06 09:00:37 np0005548916.novalocal dracut[1288]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Dec 06 09:00:37 np0005548916.novalocal dracut[1288]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Dec 06 09:00:37 np0005548916.novalocal dracut[1288]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Dec 06 09:00:37 np0005548916.novalocal dracut[1288]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Dec 06 09:00:38 np0005548916.novalocal dracut[1288]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Dec 06 09:00:38 np0005548916.novalocal dracut[1288]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Dec 06 09:00:38 np0005548916.novalocal dracut[1288]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Dec 06 09:00:38 np0005548916.novalocal dracut[1288]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Dec 06 09:00:38 np0005548916.novalocal dracut[1288]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Dec 06 09:00:38 np0005548916.novalocal dracut[1288]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Dec 06 09:00:38 np0005548916.novalocal dracut[1288]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Dec 06 09:00:38 np0005548916.novalocal dracut[1288]: Module 'ifcfg' will not be installed, because it's in the list to be omitted!
Dec 06 09:00:38 np0005548916.novalocal dracut[1288]: Module 'plymouth' will not be installed, because it's in the list to be omitted!
Dec 06 09:00:38 np0005548916.novalocal dracut[1288]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Dec 06 09:00:38 np0005548916.novalocal dracut[1288]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Dec 06 09:00:38 np0005548916.novalocal dracut[1288]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Dec 06 09:00:38 np0005548916.novalocal dracut[1288]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Dec 06 09:00:38 np0005548916.novalocal dracut[1288]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Dec 06 09:00:38 np0005548916.novalocal dracut[1288]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Dec 06 09:00:38 np0005548916.novalocal dracut[1288]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Dec 06 09:00:38 np0005548916.novalocal dracut[1288]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Dec 06 09:00:38 np0005548916.novalocal dracut[1288]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Dec 06 09:00:38 np0005548916.novalocal dracut[1288]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Dec 06 09:00:38 np0005548916.novalocal dracut[1288]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Dec 06 09:00:38 np0005548916.novalocal dracut[1288]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Dec 06 09:00:38 np0005548916.novalocal dracut[1288]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Dec 06 09:00:38 np0005548916.novalocal dracut[1288]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Dec 06 09:00:38 np0005548916.novalocal dracut[1288]: Module 'resume' will not be installed, because it's in the list to be omitted!
Dec 06 09:00:38 np0005548916.novalocal dracut[1288]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Dec 06 09:00:38 np0005548916.novalocal dracut[1288]: Module 'earlykdump' will not be installed, because it's in the list to be omitted!
Dec 06 09:00:38 np0005548916.novalocal dracut[1288]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Dec 06 09:00:38 np0005548916.novalocal dracut[1288]: memstrack is not available
Dec 06 09:00:38 np0005548916.novalocal dracut[1288]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Dec 06 09:00:38 np0005548916.novalocal dracut[1288]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Dec 06 09:00:38 np0005548916.novalocal dracut[1288]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Dec 06 09:00:38 np0005548916.novalocal dracut[1288]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Dec 06 09:00:38 np0005548916.novalocal dracut[1288]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Dec 06 09:00:38 np0005548916.novalocal dracut[1288]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Dec 06 09:00:38 np0005548916.novalocal dracut[1288]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Dec 06 09:00:38 np0005548916.novalocal dracut[1288]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Dec 06 09:00:38 np0005548916.novalocal dracut[1288]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Dec 06 09:00:38 np0005548916.novalocal dracut[1288]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Dec 06 09:00:38 np0005548916.novalocal dracut[1288]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Dec 06 09:00:38 np0005548916.novalocal dracut[1288]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Dec 06 09:00:38 np0005548916.novalocal dracut[1288]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Dec 06 09:00:38 np0005548916.novalocal dracut[1288]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Dec 06 09:00:38 np0005548916.novalocal dracut[1288]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Dec 06 09:00:38 np0005548916.novalocal dracut[1288]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Dec 06 09:00:38 np0005548916.novalocal dracut[1288]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Dec 06 09:00:38 np0005548916.novalocal dracut[1288]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Dec 06 09:00:38 np0005548916.novalocal dracut[1288]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Dec 06 09:00:38 np0005548916.novalocal dracut[1288]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Dec 06 09:00:38 np0005548916.novalocal dracut[1288]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Dec 06 09:00:38 np0005548916.novalocal dracut[1288]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Dec 06 09:00:38 np0005548916.novalocal dracut[1288]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Dec 06 09:00:38 np0005548916.novalocal dracut[1288]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Dec 06 09:00:38 np0005548916.novalocal dracut[1288]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Dec 06 09:00:38 np0005548916.novalocal dracut[1288]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Dec 06 09:00:38 np0005548916.novalocal dracut[1288]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Dec 06 09:00:38 np0005548916.novalocal dracut[1288]: memstrack is not available
Dec 06 09:00:38 np0005548916.novalocal dracut[1288]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
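[Annotation] Everything from the dracut banner down to here is kdumpctl rebuilding the crash initramfs; the module checks appear twice in the log, and each missing command simply drops an optional module, which is expected on a minimal guest with no LVM, iSCSI, or TPM tooling. The equivalent manual operations, assuming the kexec-tools layout on this host:

    kdumpctl status                # reports whether the crash kernel is loaded
    kdumpctl rebuild               # re-runs the dracut command logged at 09:00:37
    dracut --list-modules | head   # the candidate module list these checks walk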
Dec 06 09:00:39 np0005548916.novalocal dracut[1288]: *** Including module: systemd ***
Dec 06 09:00:39 np0005548916.novalocal dracut[1288]: *** Including module: fips ***
Dec 06 09:00:39 np0005548916.novalocal dracut[1288]: *** Including module: systemd-initrd ***
Dec 06 09:00:39 np0005548916.novalocal dracut[1288]: *** Including module: i18n ***
Dec 06 09:00:39 np0005548916.novalocal dracut[1288]: *** Including module: drm ***
Dec 06 09:00:40 np0005548916.novalocal chronyd[796]: Selected source 174.142.148.226 (2.centos.pool.ntp.org)
Dec 06 09:00:40 np0005548916.novalocal chronyd[796]: System clock TAI offset set to 37 seconds
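[Annotation] chronyd, started at 09:00:33 with a level-2 seccomp filter, has now selected 2.centos.pool.ntp.org server 174.142.148.226 and learned the 37 s TAI-UTC offset. Sync state can be watched with chronyc:

    chronyc tracking      # reference ID, stratum, and current offset
    chronyc sources -v    # 174.142.148.226 marked '*' as the selected source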
Dec 06 09:00:40 np0005548916.novalocal dracut[1288]: *** Including module: prefixdevname ***
Dec 06 09:00:40 np0005548916.novalocal dracut[1288]: *** Including module: kernel-modules ***
Dec 06 09:00:40 np0005548916.novalocal kernel: block vda: the capability attribute has been deprecated.
Dec 06 09:00:40 np0005548916.novalocal dracut[1288]: *** Including module: kernel-modules-extra ***
Dec 06 09:00:40 np0005548916.novalocal dracut[1288]:   kernel-modules-extra: configuration source "/run/depmod.d" does not exist
Dec 06 09:00:40 np0005548916.novalocal dracut[1288]:   kernel-modules-extra: configuration source "/lib/depmod.d" does not exist
Dec 06 09:00:40 np0005548916.novalocal dracut[1288]:   kernel-modules-extra: parsing configuration file "/etc/depmod.d/dist.conf"
Dec 06 09:00:40 np0005548916.novalocal dracut[1288]:   kernel-modules-extra: /etc/depmod.d/dist.conf: added "updates extra built-in weak-updates" to the list of search directories
Dec 06 09:00:40 np0005548916.novalocal dracut[1288]: *** Including module: qemu ***
Dec 06 09:00:40 np0005548916.novalocal dracut[1288]: *** Including module: fstab-sys ***
Dec 06 09:00:40 np0005548916.novalocal dracut[1288]: *** Including module: rootfs-block ***
Dec 06 09:00:40 np0005548916.novalocal dracut[1288]: *** Including module: terminfo ***
Dec 06 09:00:40 np0005548916.novalocal dracut[1288]: *** Including module: udev-rules ***
Dec 06 09:00:41 np0005548916.novalocal dracut[1288]: Skipping udev rule: 91-permissions.rules
Dec 06 09:00:41 np0005548916.novalocal dracut[1288]: Skipping udev rule: 80-drivers-modprobe.rules
Dec 06 09:00:41 np0005548916.novalocal dracut[1288]: *** Including module: virtiofs ***
Dec 06 09:00:41 np0005548916.novalocal dracut[1288]: *** Including module: dracut-systemd ***
Dec 06 09:00:41 np0005548916.novalocal dracut[1288]: *** Including module: usrmount ***
Dec 06 09:00:41 np0005548916.novalocal dracut[1288]: *** Including module: base ***
Dec 06 09:00:41 np0005548916.novalocal dracut[1288]: *** Including module: fs-lib ***
Dec 06 09:00:41 np0005548916.novalocal dracut[1288]: *** Including module: kdumpbase ***
Dec 06 09:00:42 np0005548916.novalocal dracut[1288]: *** Including module: microcode_ctl-fw_dir_override ***
Dec 06 09:00:42 np0005548916.novalocal dracut[1288]:   microcode_ctl module: mangling fw_dir
Dec 06 09:00:42 np0005548916.novalocal dracut[1288]:     microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Dec 06 09:00:42 np0005548916.novalocal dracut[1288]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Dec 06 09:00:42 np0005548916.novalocal dracut[1288]:     microcode_ctl: configuration "intel" is ignored
Dec 06 09:00:42 np0005548916.novalocal dracut[1288]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Dec 06 09:00:42 np0005548916.novalocal dracut[1288]:     microcode_ctl: configuration "intel-06-2d-07" is ignored
Dec 06 09:00:42 np0005548916.novalocal dracut[1288]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Dec 06 09:00:42 np0005548916.novalocal dracut[1288]:     microcode_ctl: configuration "intel-06-4e-03" is ignored
Dec 06 09:00:42 np0005548916.novalocal dracut[1288]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Dec 06 09:00:42 np0005548916.novalocal dracut[1288]:     microcode_ctl: configuration "intel-06-4f-01" is ignored
Dec 06 09:00:42 np0005548916.novalocal dracut[1288]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Dec 06 09:00:42 np0005548916.novalocal dracut[1288]:     microcode_ctl: configuration "intel-06-55-04" is ignored
Dec 06 09:00:42 np0005548916.novalocal dracut[1288]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Dec 06 09:00:42 np0005548916.novalocal dracut[1288]:     microcode_ctl: configuration "intel-06-5e-03" is ignored
Dec 06 09:00:42 np0005548916.novalocal dracut[1288]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Dec 06 09:00:42 np0005548916.novalocal dracut[1288]:     microcode_ctl: configuration "intel-06-8c-01" is ignored
Dec 06 09:00:42 np0005548916.novalocal dracut[1288]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Dec 06 09:00:42 np0005548916.novalocal dracut[1288]:     microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Dec 06 09:00:42 np0005548916.novalocal dracut[1288]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Dec 06 09:00:42 np0005548916.novalocal dracut[1288]:     microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Dec 06 09:00:42 np0005548916.novalocal dracut[1288]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Dec 06 09:00:42 np0005548916.novalocal dracut[1288]:     microcode_ctl: configuration "intel-06-8f-08" is ignored
Dec 06 09:00:42 np0005548916.novalocal dracut[1288]:     microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Dec 06 09:00:42 np0005548916.novalocal dracut[1288]: *** Including module: openssl ***
Dec 06 09:00:42 np0005548916.novalocal dracut[1288]: *** Including module: shutdown ***
Dec 06 09:00:43 np0005548916.novalocal dracut[1288]: *** Including module: squash ***
Dec 06 09:00:43 np0005548916.novalocal dracut[1288]: *** Including modules done ***
Dec 06 09:00:43 np0005548916.novalocal dracut[1288]: *** Installing kernel module dependencies ***
Dec 06 09:00:43 np0005548916.novalocal dracut[1288]: *** Installing kernel module dependencies done ***
Dec 06 09:00:43 np0005548916.novalocal dracut[1288]: *** Resolving executable dependencies ***
Dec 06 09:00:44 np0005548916.novalocal irqbalance[783]: Cannot change IRQ 25 affinity: Operation not permitted
Dec 06 09:00:44 np0005548916.novalocal irqbalance[783]: IRQ 25 affinity is now unmanaged
Dec 06 09:00:44 np0005548916.novalocal irqbalance[783]: Cannot change IRQ 31 affinity: Operation not permitted
Dec 06 09:00:44 np0005548916.novalocal irqbalance[783]: IRQ 31 affinity is now unmanaged
Dec 06 09:00:44 np0005548916.novalocal irqbalance[783]: Cannot change IRQ 28 affinity: Operation not permitted
Dec 06 09:00:44 np0005548916.novalocal irqbalance[783]: IRQ 28 affinity is now unmanaged
Dec 06 09:00:44 np0005548916.novalocal irqbalance[783]: Cannot change IRQ 32 affinity: Operation not permitted
Dec 06 09:00:44 np0005548916.novalocal irqbalance[783]: IRQ 32 affinity is now unmanaged
Dec 06 09:00:44 np0005548916.novalocal irqbalance[783]: Cannot change IRQ 30 affinity: Operation not permitted
Dec 06 09:00:44 np0005548916.novalocal irqbalance[783]: IRQ 30 affinity is now unmanaged
Dec 06 09:00:44 np0005548916.novalocal irqbalance[783]: Cannot change IRQ 29 affinity: Operation not permitted
Dec 06 09:00:44 np0005548916.novalocal irqbalance[783]: IRQ 29 affinity is now unmanaged
Dec 06 09:00:45 np0005548916.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 06 09:00:45 np0005548916.novalocal dracut[1288]: *** Resolving executable dependencies done ***
Dec 06 09:00:45 np0005548916.novalocal dracut[1288]: *** Generating early-microcode cpio image ***
Dec 06 09:00:45 np0005548916.novalocal dracut[1288]: *** Store current command line parameters ***
Dec 06 09:00:45 np0005548916.novalocal dracut[1288]: Stored kernel commandline:
Dec 06 09:00:45 np0005548916.novalocal dracut[1288]: No dracut internal kernel commandline stored in the initramfs
Dec 06 09:00:45 np0005548916.novalocal dracut[1288]: *** Install squash loader ***
Dec 06 09:00:46 np0005548916.novalocal dracut[1288]: *** Squashing the files inside the initramfs ***
Dec 06 09:00:47 np0005548916.novalocal dracut[1288]: *** Squashing the files inside the initramfs done ***
Dec 06 09:00:47 np0005548916.novalocal dracut[1288]: *** Creating image file '/boot/initramfs-5.14.0-645.el9.x86_64kdump.img' ***
Dec 06 09:00:47 np0005548916.novalocal dracut[1288]: *** Hardlinking files ***
Dec 06 09:00:47 np0005548916.novalocal dracut[1288]: Mode:           real
Dec 06 09:00:47 np0005548916.novalocal dracut[1288]: Files:          50
Dec 06 09:00:47 np0005548916.novalocal dracut[1288]: Linked:         0 files
Dec 06 09:00:47 np0005548916.novalocal dracut[1288]: Compared:       0 xattrs
Dec 06 09:00:47 np0005548916.novalocal dracut[1288]: Compared:       0 files
Dec 06 09:00:47 np0005548916.novalocal dracut[1288]: Saved:          0 B
Dec 06 09:00:47 np0005548916.novalocal dracut[1288]: Duration:       0.000933 seconds
Dec 06 09:00:47 np0005548916.novalocal dracut[1288]: *** Hardlinking files done ***
Dec 06 09:00:48 np0005548916.novalocal dracut[1288]: *** Creating initramfs image file '/boot/initramfs-5.14.0-645.el9.x86_64kdump.img' done ***
Dec 06 09:00:48 np0005548916.novalocal kdumpctl[1016]: kdump: kexec: loaded kdump kernel
Dec 06 09:00:48 np0005548916.novalocal kdumpctl[1016]: kdump: Starting kdump: [OK]
Dec 06 09:00:48 np0005548916.novalocal systemd[1]: Finished Crash recovery kernel arming.
Dec 06 09:00:48 np0005548916.novalocal systemd[1]: Startup finished in 1.552s (kernel) + 3.408s (initrd) + 16.454s (userspace) = 21.415s.
Dec 06 09:00:54 np0005548916.novalocal sshd-session[4299]: Accepted publickey for zuul from 38.102.83.114 port 60520 ssh2: RSA SHA256:zhs3MiW0JhxzckYcMHQES8SMYHj1iGcomnyzmbiwor8
Dec 06 09:00:54 np0005548916.novalocal systemd[1]: Created slice User Slice of UID 1000.
Dec 06 09:00:54 np0005548916.novalocal systemd[1]: Starting User Runtime Directory /run/user/1000...
Dec 06 09:00:54 np0005548916.novalocal systemd-logind[788]: New session 1 of user zuul.
Dec 06 09:00:54 np0005548916.novalocal systemd[1]: Finished User Runtime Directory /run/user/1000.
Dec 06 09:00:54 np0005548916.novalocal systemd[1]: Starting User Manager for UID 1000...
Dec 06 09:00:54 np0005548916.novalocal systemd[4303]: pam_unix(systemd-user:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 06 09:00:54 np0005548916.novalocal systemd[4303]: Queued start job for default target Main User Target.
Dec 06 09:00:54 np0005548916.novalocal systemd[4303]: Created slice User Application Slice.
Dec 06 09:00:54 np0005548916.novalocal systemd[4303]: Started Mark boot as successful after the user session has run 2 minutes.
Dec 06 09:00:54 np0005548916.novalocal systemd[4303]: Started Daily Cleanup of User's Temporary Directories.
Dec 06 09:00:54 np0005548916.novalocal systemd[4303]: Reached target Paths.
Dec 06 09:00:54 np0005548916.novalocal systemd[4303]: Reached target Timers.
Dec 06 09:00:54 np0005548916.novalocal systemd[4303]: Starting D-Bus User Message Bus Socket...
Dec 06 09:00:54 np0005548916.novalocal systemd[4303]: Starting Create User's Volatile Files and Directories...
Dec 06 09:00:54 np0005548916.novalocal systemd[4303]: Listening on D-Bus User Message Bus Socket.
Dec 06 09:00:54 np0005548916.novalocal systemd[4303]: Reached target Sockets.
Dec 06 09:00:54 np0005548916.novalocal systemd[4303]: Finished Create User's Volatile Files and Directories.
Dec 06 09:00:54 np0005548916.novalocal systemd[4303]: Reached target Basic System.
Dec 06 09:00:54 np0005548916.novalocal systemd[4303]: Reached target Main User Target.
Dec 06 09:00:54 np0005548916.novalocal systemd[4303]: Startup finished in 163ms.
Dec 06 09:00:54 np0005548916.novalocal systemd[1]: Started User Manager for UID 1000.
Dec 06 09:00:54 np0005548916.novalocal systemd[1]: Started Session 1 of User zuul.
Dec 06 09:00:54 np0005548916.novalocal sshd-session[4299]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 06 09:00:55 np0005548916.novalocal python3[4385]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:00:58 np0005548916.novalocal python3[4413]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:01:01 np0005548916.novalocal CROND[4449]: (root) CMD (run-parts /etc/cron.hourly)
Dec 06 09:01:01 np0005548916.novalocal run-parts[4452]: (/etc/cron.hourly) starting 0anacron
Dec 06 09:01:01 np0005548916.novalocal anacron[4460]: Anacron started on 2025-12-06
Dec 06 09:01:01 np0005548916.novalocal anacron[4460]: Will run job `cron.daily' in 20 min.
Dec 06 09:01:01 np0005548916.novalocal anacron[4460]: Will run job `cron.weekly' in 40 min.
Dec 06 09:01:01 np0005548916.novalocal anacron[4460]: Will run job `cron.monthly' in 60 min.
Dec 06 09:01:01 np0005548916.novalocal anacron[4460]: Jobs will be executed sequentially
Dec 06 09:01:01 np0005548916.novalocal run-parts[4462]: (/etc/cron.hourly) finished 0anacron
Dec 06 09:01:01 np0005548916.novalocal CROND[4448]: (root) CMDEND (run-parts /etc/cron.hourly)
Dec 06 09:01:04 np0005548916.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 06 09:01:05 np0005548916.novalocal python3[4488]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:01:06 np0005548916.novalocal python3[4528]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Dec 06 09:01:08 np0005548916.novalocal python3[4554]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDU0JPqo3RlcbkISWeWyZyh8N1DipPCXKbgbj83sLrBXd5pRLoLdbqBjiuLvFfP7lb5gET6+eP3VZiOMI6UHmEm8ynKQRTIQ7lxC6wlJ/5bEkQ7shEony5Dt8S+/YriKnW8SR/bfYJwGVDGiYwX9+YLTEkgtaWYCW5aOhF1JYR2fNVZQyTaBuiZFc/j1+ce31wCfSAIAFETx4TP71KVZET/mDhOPfYQSE6dNJCcZnohKVSa1SHNL0bVxbehOrQrmqmiRc81piGO4LAMvuSM3op7QTjc7lDDNoYX/DWm/O6Yd8IV5PAI5jAYm4zViXyj8K/iPfclSAUCutpd/HwsQjjiI9Ei0ObVrpLhV3PWw6UkMmfRl4sN90Bhg/95I6taoeEDSSNojukndyGr3lxM1SkEHO0ZamuvQmAOsP05x89hsZFP9E+RntviBPqrCNyyiE7JEy2H1WfIK5i0KA/BC8M+osytKOc1zBu/jI4TYPr32yUNd7mIBDzpNaUok32L4Pk= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 09:01:09 np0005548916.novalocal python3[4578]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:01:09 np0005548916.novalocal python3[4677]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 09:01:10 np0005548916.novalocal python3[4748]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765011669.4133372-252-170856646153745/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=66d341c321a043af9793d30ca9726f09_id_rsa follow=False checksum=1c48fa8bdbec038bf9f0f4b497dca115d790ad66 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:01:10 np0005548916.novalocal python3[4871]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 09:01:11 np0005548916.novalocal python3[4942]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765011670.3117692-307-246098217896273/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=66d341c321a043af9793d30ca9726f09_id_rsa.pub follow=False checksum=e7cbe2647d02b25f8aa52dd3d3a0ea1aa1cad833 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:01:12 np0005548916.novalocal python3[4990]: ansible-ping Invoked with data=pong
Dec 06 09:01:13 np0005548916.novalocal python3[5014]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:01:15 np0005548916.novalocal python3[5072]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Dec 06 09:01:16 np0005548916.novalocal python3[5104]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:01:16 np0005548916.novalocal python3[5128]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:01:17 np0005548916.novalocal python3[5152]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:01:17 np0005548916.novalocal python3[5176]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:01:17 np0005548916.novalocal python3[5200]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:01:18 np0005548916.novalocal python3[5224]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:01:19 np0005548916.novalocal sudo[5248]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hyrcyhxoxjhoisxbsnztfpxnxvwdknvz ; /usr/bin/python3'
Dec 06 09:01:19 np0005548916.novalocal sudo[5248]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:01:19 np0005548916.novalocal python3[5250]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:01:19 np0005548916.novalocal sudo[5248]: pam_unix(sudo:session): session closed for user root
Dec 06 09:01:20 np0005548916.novalocal sudo[5326]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzvfcidfbrhtmiwmnnbxpbdiwowwxrds ; /usr/bin/python3'
Dec 06 09:01:20 np0005548916.novalocal sudo[5326]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:01:20 np0005548916.novalocal python3[5328]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 09:01:20 np0005548916.novalocal sudo[5326]: pam_unix(sudo:session): session closed for user root
Dec 06 09:01:20 np0005548916.novalocal sudo[5399]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ihffdvpjtfdlrhtwgkujohnxadvvvvjj ; /usr/bin/python3'
Dec 06 09:01:20 np0005548916.novalocal sudo[5399]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:01:20 np0005548916.novalocal python3[5401]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1765011680.0756123-32-186420266149685/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:01:20 np0005548916.novalocal sudo[5399]: pam_unix(sudo:session): session closed for user root
Dec 06 09:01:21 np0005548916.novalocal python3[5449]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 09:01:21 np0005548916.novalocal python3[5473]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 09:01:22 np0005548916.novalocal python3[5497]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 09:01:22 np0005548916.novalocal python3[5521]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 09:01:22 np0005548916.novalocal python3[5545]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 09:01:23 np0005548916.novalocal python3[5569]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 09:01:23 np0005548916.novalocal python3[5593]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 09:01:23 np0005548916.novalocal python3[5617]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 09:01:23 np0005548916.novalocal python3[5641]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 09:01:24 np0005548916.novalocal python3[5665]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 09:01:24 np0005548916.novalocal python3[5689]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 09:01:24 np0005548916.novalocal python3[5713]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 09:01:25 np0005548916.novalocal python3[5737]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 09:01:25 np0005548916.novalocal python3[5761]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 09:01:25 np0005548916.novalocal python3[5785]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 09:01:25 np0005548916.novalocal python3[5809]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 09:01:26 np0005548916.novalocal python3[5833]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 09:01:26 np0005548916.novalocal python3[5857]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 09:01:26 np0005548916.novalocal python3[5881]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 09:01:27 np0005548916.novalocal python3[5905]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 09:01:27 np0005548916.novalocal python3[5929]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 09:01:27 np0005548916.novalocal python3[5953]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 09:01:28 np0005548916.novalocal python3[5977]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 09:01:28 np0005548916.novalocal python3[6001]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 09:01:28 np0005548916.novalocal python3[6025]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 09:01:28 np0005548916.novalocal python3[6049]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 09:01:31 np0005548916.novalocal sudo[6073]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-waiqurneyzjeegsdovcuizcdzodhemuk ; /usr/bin/python3'
Dec 06 09:01:31 np0005548916.novalocal sudo[6073]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:01:31 np0005548916.novalocal python3[6075]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Dec 06 09:01:31 np0005548916.novalocal systemd[1]: Starting Time & Date Service...
Dec 06 09:01:31 np0005548916.novalocal systemd[1]: Started Time & Date Service.
Dec 06 09:01:31 np0005548916.novalocal systemd-timedated[6077]: Changed time zone to 'UTC' (UTC).
Dec 06 09:01:31 np0005548916.novalocal sudo[6073]: pam_unix(sudo:session): session closed for user root
Dec 06 09:01:32 np0005548916.novalocal sudo[6104]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jzrpnywcjcwhqeucteizipchalywqyul ; /usr/bin/python3'
Dec 06 09:01:32 np0005548916.novalocal sudo[6104]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:01:32 np0005548916.novalocal python3[6106]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:01:32 np0005548916.novalocal sudo[6104]: pam_unix(sudo:session): session closed for user root
Dec 06 09:01:32 np0005548916.novalocal python3[6182]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 09:01:33 np0005548916.novalocal python3[6253]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1765011692.431627-252-264182487388174/source _original_basename=tmp4gtehc5d follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:01:33 np0005548916.novalocal python3[6353]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 09:01:34 np0005548916.novalocal python3[6424]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1765011693.3420992-303-216107513659361/source _original_basename=tmplwio4_od follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:01:34 np0005548916.novalocal sudo[6524]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmievhurerxdwhodkbycgvtcoaxhomfi ; /usr/bin/python3'
Dec 06 09:01:34 np0005548916.novalocal sudo[6524]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:01:35 np0005548916.novalocal python3[6526]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 09:01:35 np0005548916.novalocal sudo[6524]: pam_unix(sudo:session): session closed for user root
Dec 06 09:01:35 np0005548916.novalocal sudo[6597]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sotaiqtywraegdnrxdgfqjgldidxhedq ; /usr/bin/python3'
Dec 06 09:01:35 np0005548916.novalocal sudo[6597]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:01:35 np0005548916.novalocal python3[6599]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1765011694.6634886-383-9733592598989/source _original_basename=tmpml1ps5qd follow=False checksum=0200c222fd008cff1969c6c814381aad26405e22 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:01:35 np0005548916.novalocal sudo[6597]: pam_unix(sudo:session): session closed for user root
Dec 06 09:01:35 np0005548916.novalocal python3[6647]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:01:36 np0005548916.novalocal python3[6673]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:01:36 np0005548916.novalocal sudo[6751]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-teyulnjxhxffwtduavlqfjeacyrphsza ; /usr/bin/python3'
Dec 06 09:01:36 np0005548916.novalocal sudo[6751]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:01:36 np0005548916.novalocal python3[6753]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 09:01:36 np0005548916.novalocal sudo[6751]: pam_unix(sudo:session): session closed for user root
Dec 06 09:01:36 np0005548916.novalocal sudo[6824]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tvubtrkrorycxoyqjcbylryqjbvoxmry ; /usr/bin/python3'
Dec 06 09:01:36 np0005548916.novalocal sudo[6824]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:01:37 np0005548916.novalocal python3[6826]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1765011696.3969817-453-219858681411835/source _original_basename=tmpqw1g566e follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:01:37 np0005548916.novalocal sudo[6824]: pam_unix(sudo:session): session closed for user root
Dec 06 09:01:37 np0005548916.novalocal sudo[6875]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-veivtfhormlkiuyekcrvlwyuxebivhbz ; /usr/bin/python3'
Dec 06 09:01:37 np0005548916.novalocal sudo[6875]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:01:37 np0005548916.novalocal python3[6877]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163efc-24cc-c2c1-5ee8-00000000001f-1-compute1 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:01:37 np0005548916.novalocal sudo[6875]: pam_unix(sudo:session): session closed for user root
Dec 06 09:01:38 np0005548916.novalocal python3[6905]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env _uses_shell=True zuul_log_id=fa163efc-24cc-c2c1-5ee8-000000000020-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Dec 06 09:01:39 np0005548916.novalocal python3[6933]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:01:58 np0005548916.novalocal sudo[6957]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jjolxhxbmkppvqstxbpsbfyjiickfqqt ; /usr/bin/python3'
Dec 06 09:01:58 np0005548916.novalocal sudo[6957]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:01:59 np0005548916.novalocal python3[6959]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:01:59 np0005548916.novalocal sudo[6957]: pam_unix(sudo:session): session closed for user root
Dec 06 09:02:01 np0005548916.novalocal systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec 06 09:02:59 np0005548916.novalocal sshd-session[4312]: Received disconnect from 38.102.83.114 port 60520:11: disconnected by user
Dec 06 09:02:59 np0005548916.novalocal sshd-session[4312]: Disconnected from user zuul 38.102.83.114 port 60520
Dec 06 09:02:59 np0005548916.novalocal sshd-session[4299]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:02:59 np0005548916.novalocal systemd-logind[788]: Session 1 logged out. Waiting for processes to exit.
Dec 06 09:03:03 np0005548916.novalocal kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Dec 06 09:03:03 np0005548916.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Dec 06 09:03:03 np0005548916.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Dec 06 09:03:03 np0005548916.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Dec 06 09:03:03 np0005548916.novalocal kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Dec 06 09:03:03 np0005548916.novalocal kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Dec 06 09:03:03 np0005548916.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Dec 06 09:03:03 np0005548916.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Dec 06 09:03:03 np0005548916.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Dec 06 09:03:03 np0005548916.novalocal kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Dec 06 09:03:03 np0005548916.novalocal NetworkManager[859]: <info>  [1765011783.7222] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Dec 06 09:03:03 np0005548916.novalocal systemd-udevd[6963]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 09:03:03 np0005548916.novalocal NetworkManager[859]: <info>  [1765011783.7504] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 06 09:03:03 np0005548916.novalocal NetworkManager[859]: <info>  [1765011783.7533] settings: (eth1): created default wired connection 'Wired connection 1'
Dec 06 09:03:03 np0005548916.novalocal NetworkManager[859]: <info>  [1765011783.7537] device (eth1): carrier: link connected
Dec 06 09:03:03 np0005548916.novalocal NetworkManager[859]: <info>  [1765011783.7539] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Dec 06 09:03:03 np0005548916.novalocal NetworkManager[859]: <info>  [1765011783.7544] policy: auto-activating connection 'Wired connection 1' (d0a7d597-e5ec-3c93-9ea9-45506a05a0f2)
Dec 06 09:03:03 np0005548916.novalocal NetworkManager[859]: <info>  [1765011783.7547] device (eth1): Activation: starting connection 'Wired connection 1' (d0a7d597-e5ec-3c93-9ea9-45506a05a0f2)
Dec 06 09:03:03 np0005548916.novalocal NetworkManager[859]: <info>  [1765011783.7548] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 06 09:03:03 np0005548916.novalocal NetworkManager[859]: <info>  [1765011783.7551] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 06 09:03:03 np0005548916.novalocal NetworkManager[859]: <info>  [1765011783.7554] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 06 09:03:03 np0005548916.novalocal NetworkManager[859]: <info>  [1765011783.7558] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Dec 06 09:03:03 np0005548916.novalocal systemd[4303]: Starting Mark boot as successful...
Dec 06 09:03:03 np0005548916.novalocal systemd[4303]: Finished Mark boot as successful.
Dec 06 09:03:04 np0005548916.novalocal sshd-session[6967]: Accepted publickey for zuul from 38.102.83.114 port 41778 ssh2: RSA SHA256:spwPcL19sPHC+yJA+ECEA4UNmpshOiR8KfgtTbViJeA
Dec 06 09:03:04 np0005548916.novalocal systemd-logind[788]: New session 3 of user zuul.
Dec 06 09:03:04 np0005548916.novalocal systemd[1]: Started Session 3 of User zuul.
Dec 06 09:03:04 np0005548916.novalocal sshd-session[6967]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 06 09:03:04 np0005548916.novalocal python3[6994]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163efc-24cc-5a9f-9569-00000000018f-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:03:14 np0005548916.novalocal sudo[7072]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lufczlikeoiztawyqeedsshlnqltaauk ; OS_CLOUD=vexxhost /usr/bin/python3'
Dec 06 09:03:14 np0005548916.novalocal sudo[7072]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:03:15 np0005548916.novalocal python3[7074]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 09:03:15 np0005548916.novalocal sudo[7072]: pam_unix(sudo:session): session closed for user root
Dec 06 09:03:15 np0005548916.novalocal sudo[7145]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fsfkxsvzuwalonkjqliosumpjncskoww ; OS_CLOUD=vexxhost /usr/bin/python3'
Dec 06 09:03:15 np0005548916.novalocal sudo[7145]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:03:15 np0005548916.novalocal python3[7147]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765011794.7520046-155-265520938592310/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=830b9277befaf6f767205b89543169fefeef2ac1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:03:15 np0005548916.novalocal sudo[7145]: pam_unix(sudo:session): session closed for user root
Dec 06 09:03:15 np0005548916.novalocal sudo[7195]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvaptexaifstgyguqkjcelcnvzqqmxlx ; OS_CLOUD=vexxhost /usr/bin/python3'
Dec 06 09:03:15 np0005548916.novalocal sudo[7195]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:03:16 np0005548916.novalocal python3[7197]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 06 09:03:16 np0005548916.novalocal systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Dec 06 09:03:16 np0005548916.novalocal systemd[1]: Stopped Network Manager Wait Online.
Dec 06 09:03:16 np0005548916.novalocal systemd[1]: Stopping Network Manager Wait Online...
Dec 06 09:03:16 np0005548916.novalocal NetworkManager[859]: <info>  [1765011796.0699] caught SIGTERM, shutting down normally.
Dec 06 09:03:16 np0005548916.novalocal systemd[1]: Stopping Network Manager...
Dec 06 09:03:16 np0005548916.novalocal NetworkManager[859]: <info>  [1765011796.0713] dhcp4 (eth0): canceled DHCP transaction
Dec 06 09:03:16 np0005548916.novalocal NetworkManager[859]: <info>  [1765011796.0713] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 06 09:03:16 np0005548916.novalocal NetworkManager[859]: <info>  [1765011796.0714] dhcp4 (eth0): state changed no lease
Dec 06 09:03:16 np0005548916.novalocal NetworkManager[859]: <info>  [1765011796.0719] manager: NetworkManager state is now CONNECTING
Dec 06 09:03:16 np0005548916.novalocal NetworkManager[859]: <info>  [1765011796.0812] dhcp4 (eth1): canceled DHCP transaction
Dec 06 09:03:16 np0005548916.novalocal NetworkManager[859]: <info>  [1765011796.0812] dhcp4 (eth1): state changed no lease
Dec 06 09:03:16 np0005548916.novalocal NetworkManager[859]: <info>  [1765011796.0885] exiting (success)
Dec 06 09:03:16 np0005548916.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 06 09:03:16 np0005548916.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 06 09:03:16 np0005548916.novalocal systemd[1]: NetworkManager.service: Deactivated successfully.
Dec 06 09:03:16 np0005548916.novalocal systemd[1]: Stopped Network Manager.
Dec 06 09:03:16 np0005548916.novalocal systemd[1]: NetworkManager.service: Consumed 1.088s CPU time, 10.0M memory peak.
Dec 06 09:03:16 np0005548916.novalocal systemd[1]: Starting Network Manager...
Dec 06 09:03:16 np0005548916.novalocal NetworkManager[7209]: <info>  [1765011796.1555] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:27715b31-3399-4bbf-a0fa-54836c80918e)
Dec 06 09:03:16 np0005548916.novalocal NetworkManager[7209]: <info>  [1765011796.1558] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Dec 06 09:03:16 np0005548916.novalocal NetworkManager[7209]: <info>  [1765011796.1620] manager[0x565155928070]: monitoring kernel firmware directory '/lib/firmware'.
Dec 06 09:03:16 np0005548916.novalocal systemd[1]: Starting Hostname Service...
Dec 06 09:03:16 np0005548916.novalocal systemd[1]: Started Hostname Service.
Dec 06 09:03:16 np0005548916.novalocal NetworkManager[7209]: <info>  [1765011796.2678] hostname: hostname: using hostnamed
Dec 06 09:03:16 np0005548916.novalocal NetworkManager[7209]: <info>  [1765011796.2678] hostname: static hostname changed from (none) to "np0005548916.novalocal"
Dec 06 09:03:16 np0005548916.novalocal NetworkManager[7209]: <info>  [1765011796.2685] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Dec 06 09:03:16 np0005548916.novalocal NetworkManager[7209]: <info>  [1765011796.2692] manager[0x565155928070]: rfkill: Wi-Fi hardware radio set enabled
Dec 06 09:03:16 np0005548916.novalocal NetworkManager[7209]: <info>  [1765011796.2692] manager[0x565155928070]: rfkill: WWAN hardware radio set enabled
Dec 06 09:03:16 np0005548916.novalocal NetworkManager[7209]: <info>  [1765011796.2730] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Dec 06 09:03:16 np0005548916.novalocal NetworkManager[7209]: <info>  [1765011796.2730] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Dec 06 09:03:16 np0005548916.novalocal NetworkManager[7209]: <info>  [1765011796.2731] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Dec 06 09:03:16 np0005548916.novalocal NetworkManager[7209]: <info>  [1765011796.2732] manager: Networking is enabled by state file
Dec 06 09:03:16 np0005548916.novalocal NetworkManager[7209]: <info>  [1765011796.2735] settings: Loaded settings plugin: keyfile (internal)
Dec 06 09:03:16 np0005548916.novalocal NetworkManager[7209]: <info>  [1765011796.2740] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Dec 06 09:03:16 np0005548916.novalocal NetworkManager[7209]: <info>  [1765011796.2774] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Dec 06 09:03:16 np0005548916.novalocal NetworkManager[7209]: <info>  [1765011796.2786] dhcp: init: Using DHCP client 'internal'
Dec 06 09:03:16 np0005548916.novalocal NetworkManager[7209]: <info>  [1765011796.2790] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Dec 06 09:03:16 np0005548916.novalocal NetworkManager[7209]: <info>  [1765011796.2796] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 06 09:03:16 np0005548916.novalocal NetworkManager[7209]: <info>  [1765011796.2803] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Dec 06 09:03:16 np0005548916.novalocal NetworkManager[7209]: <info>  [1765011796.2812] device (lo): Activation: starting connection 'lo' (04d45710-56f6-4696-9924-dd30b84bf74f)
Dec 06 09:03:16 np0005548916.novalocal NetworkManager[7209]: <info>  [1765011796.2820] device (eth0): carrier: link connected
Dec 06 09:03:16 np0005548916.novalocal NetworkManager[7209]: <info>  [1765011796.2825] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Dec 06 09:03:16 np0005548916.novalocal NetworkManager[7209]: <info>  [1765011796.2830] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Dec 06 09:03:16 np0005548916.novalocal NetworkManager[7209]: <info>  [1765011796.2830] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec 06 09:03:16 np0005548916.novalocal NetworkManager[7209]: <info>  [1765011796.2837] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec 06 09:03:16 np0005548916.novalocal NetworkManager[7209]: <info>  [1765011796.2843] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec 06 09:03:16 np0005548916.novalocal NetworkManager[7209]: <info>  [1765011796.2850] device (eth1): carrier: link connected
Dec 06 09:03:16 np0005548916.novalocal NetworkManager[7209]: <info>  [1765011796.2855] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Dec 06 09:03:16 np0005548916.novalocal NetworkManager[7209]: <info>  [1765011796.2860] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (d0a7d597-e5ec-3c93-9ea9-45506a05a0f2) (indicated)
Dec 06 09:03:16 np0005548916.novalocal NetworkManager[7209]: <info>  [1765011796.2861] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec 06 09:03:16 np0005548916.novalocal NetworkManager[7209]: <info>  [1765011796.2868] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec 06 09:03:16 np0005548916.novalocal NetworkManager[7209]: <info>  [1765011796.2875] device (eth1): Activation: starting connection 'Wired connection 1' (d0a7d597-e5ec-3c93-9ea9-45506a05a0f2)
Dec 06 09:03:16 np0005548916.novalocal NetworkManager[7209]: <info>  [1765011796.2882] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Dec 06 09:03:16 np0005548916.novalocal systemd[1]: Started Network Manager.
Dec 06 09:03:16 np0005548916.novalocal NetworkManager[7209]: <info>  [1765011796.2885] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Dec 06 09:03:16 np0005548916.novalocal NetworkManager[7209]: <info>  [1765011796.2887] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Dec 06 09:03:16 np0005548916.novalocal NetworkManager[7209]: <info>  [1765011796.2888] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Dec 06 09:03:16 np0005548916.novalocal NetworkManager[7209]: <info>  [1765011796.2891] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec 06 09:03:16 np0005548916.novalocal NetworkManager[7209]: <info>  [1765011796.2894] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec 06 09:03:16 np0005548916.novalocal NetworkManager[7209]: <info>  [1765011796.2896] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec 06 09:03:16 np0005548916.novalocal NetworkManager[7209]: <info>  [1765011796.2899] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec 06 09:03:16 np0005548916.novalocal NetworkManager[7209]: <info>  [1765011796.2904] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Dec 06 09:03:16 np0005548916.novalocal NetworkManager[7209]: <info>  [1765011796.2913] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec 06 09:03:16 np0005548916.novalocal NetworkManager[7209]: <info>  [1765011796.2924] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 06 09:03:16 np0005548916.novalocal NetworkManager[7209]: <info>  [1765011796.2934] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec 06 09:03:16 np0005548916.novalocal NetworkManager[7209]: <info>  [1765011796.2936] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Dec 06 09:03:16 np0005548916.novalocal NetworkManager[7209]: <info>  [1765011796.2953] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Dec 06 09:03:16 np0005548916.novalocal NetworkManager[7209]: <info>  [1765011796.2959] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Dec 06 09:03:16 np0005548916.novalocal NetworkManager[7209]: <info>  [1765011796.2964] device (lo): Activation: successful, device activated.
Dec 06 09:03:16 np0005548916.novalocal NetworkManager[7209]: <info>  [1765011796.2971] dhcp4 (eth0): state changed new lease, address=38.102.83.113
Dec 06 09:03:16 np0005548916.novalocal NetworkManager[7209]: <info>  [1765011796.2977] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Dec 06 09:03:16 np0005548916.novalocal systemd[1]: Starting Network Manager Wait Online...
Dec 06 09:03:16 np0005548916.novalocal sudo[7195]: pam_unix(sudo:session): session closed for user root
Dec 06 09:03:16 np0005548916.novalocal NetworkManager[7209]: <info>  [1765011796.3157] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec 06 09:03:16 np0005548916.novalocal NetworkManager[7209]: <info>  [1765011796.3221] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec 06 09:03:16 np0005548916.novalocal NetworkManager[7209]: <info>  [1765011796.3223] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec 06 09:03:16 np0005548916.novalocal NetworkManager[7209]: <info>  [1765011796.3227] manager: NetworkManager state is now CONNECTED_SITE
Dec 06 09:03:16 np0005548916.novalocal NetworkManager[7209]: <info>  [1765011796.3232] device (eth0): Activation: successful, device activated.
Dec 06 09:03:16 np0005548916.novalocal NetworkManager[7209]: <info>  [1765011796.3238] manager: NetworkManager state is now CONNECTED_GLOBAL
Dec 06 09:03:16 np0005548916.novalocal python3[7281]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163efc-24cc-5a9f-9569-0000000000c8-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:03:26 np0005548916.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 06 09:03:46 np0005548916.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 06 09:03:59 np0005548916.novalocal sshd-session[7286]: Connection closed by authenticating user root 87.120.191.21 port 48304 [preauth]
Dec 06 09:04:01 np0005548916.novalocal NetworkManager[7209]: <info>  [1765011841.3250] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec 06 09:04:01 np0005548916.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 06 09:04:01 np0005548916.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 06 09:04:01 np0005548916.novalocal NetworkManager[7209]: <info>  [1765011841.3624] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec 06 09:04:01 np0005548916.novalocal NetworkManager[7209]: <info>  [1765011841.3629] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec 06 09:04:01 np0005548916.novalocal NetworkManager[7209]: <info>  [1765011841.3641] device (eth1): Activation: successful, device activated.
Dec 06 09:04:01 np0005548916.novalocal NetworkManager[7209]: <info>  [1765011841.3649] manager: startup complete
Dec 06 09:04:01 np0005548916.novalocal NetworkManager[7209]: <info>  [1765011841.3651] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Dec 06 09:04:01 np0005548916.novalocal NetworkManager[7209]: <warn>  [1765011841.3663] device (eth1): Activation: failed for connection 'Wired connection 1'
Dec 06 09:04:01 np0005548916.novalocal NetworkManager[7209]: <info>  [1765011841.3674] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Dec 06 09:04:01 np0005548916.novalocal systemd[1]: Finished Network Manager Wait Online.
Dec 06 09:04:01 np0005548916.novalocal NetworkManager[7209]: <info>  [1765011841.3792] dhcp4 (eth1): canceled DHCP transaction
Dec 06 09:04:01 np0005548916.novalocal NetworkManager[7209]: <info>  [1765011841.3792] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Dec 06 09:04:01 np0005548916.novalocal NetworkManager[7209]: <info>  [1765011841.3792] dhcp4 (eth1): state changed no lease
Dec 06 09:04:01 np0005548916.novalocal NetworkManager[7209]: <info>  [1765011841.3813] policy: auto-activating connection 'ci-private-network' (f3fb407f-d9e1-5507-a7f7-856240ad9666)
Dec 06 09:04:01 np0005548916.novalocal NetworkManager[7209]: <info>  [1765011841.3818] device (eth1): Activation: starting connection 'ci-private-network' (f3fb407f-d9e1-5507-a7f7-856240ad9666)
Dec 06 09:04:01 np0005548916.novalocal NetworkManager[7209]: <info>  [1765011841.3820] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 06 09:04:01 np0005548916.novalocal NetworkManager[7209]: <info>  [1765011841.3825] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 06 09:04:01 np0005548916.novalocal NetworkManager[7209]: <info>  [1765011841.3833] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 06 09:04:01 np0005548916.novalocal NetworkManager[7209]: <info>  [1765011841.3845] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 06 09:04:01 np0005548916.novalocal NetworkManager[7209]: <info>  [1765011841.3888] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 06 09:04:01 np0005548916.novalocal NetworkManager[7209]: <info>  [1765011841.3892] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 06 09:04:01 np0005548916.novalocal NetworkManager[7209]: <info>  [1765011841.3904] device (eth1): Activation: successful, device activated.
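
The block above is NetworkManager giving up on the assumed profile for eth1 once startup completes (state 'failed', reason 'ip-config-unavailable', then 'no lease' from dhcp4) and auto-activating the next autoconnect candidate, 'ci-private-network', which succeeds. Two standard nmcli reads to confirm which profile won and what addressing eth1 ended up with:

    # Profiles, their UUIDs, and the devices they are currently bound to
    nmcli -f NAME,UUID,DEVICE,STATE connection show
    # Effective IPv4/IPv6 configuration on the recovered interface
    nmcli device show eth1
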
Dec 06 09:04:11 np0005548916.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 06 09:04:16 np0005548916.novalocal sshd-session[6970]: Received disconnect from 38.102.83.114 port 41778:11: disconnected by user
Dec 06 09:04:16 np0005548916.novalocal sshd-session[6970]: Disconnected from user zuul 38.102.83.114 port 41778
Dec 06 09:04:16 np0005548916.novalocal sshd-session[6967]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:04:16 np0005548916.novalocal systemd[1]: session-3.scope: Deactivated successfully.
Dec 06 09:04:16 np0005548916.novalocal systemd[1]: session-3.scope: Consumed 1.722s CPU time.
Dec 06 09:04:16 np0005548916.novalocal systemd-logind[788]: Session 3 logged out. Waiting for processes to exit.
Dec 06 09:04:16 np0005548916.novalocal systemd-logind[788]: Removed session 3.
Dec 06 09:04:51 np0005548916.novalocal sshd-session[7311]: Accepted publickey for zuul from 38.102.83.114 port 43980 ssh2: RSA SHA256:spwPcL19sPHC+yJA+ECEA4UNmpshOiR8KfgtTbViJeA
Dec 06 09:04:51 np0005548916.novalocal systemd-logind[788]: New session 4 of user zuul.
Dec 06 09:04:51 np0005548916.novalocal systemd[1]: Started Session 4 of User zuul.
Dec 06 09:04:51 np0005548916.novalocal sshd-session[7311]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 06 09:04:52 np0005548916.novalocal sudo[7390]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qsdkglgqqpmltrbehoupdgyzwxxfjstv ; OS_CLOUD=vexxhost /usr/bin/python3'
Dec 06 09:04:52 np0005548916.novalocal sudo[7390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:04:52 np0005548916.novalocal python3[7392]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 09:04:52 np0005548916.novalocal sudo[7390]: pam_unix(sudo:session): session closed for user root
Dec 06 09:04:52 np0005548916.novalocal sudo[7463]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vobeevtheytihjsymyfeumbshgrsxyco ; OS_CLOUD=vexxhost /usr/bin/python3'
Dec 06 09:04:52 np0005548916.novalocal sudo[7463]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:04:52 np0005548916.novalocal python3[7465]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765011892.0843112-373-172733315032397/source _original_basename=tmpce7o65fw follow=False checksum=81d87914000d1f03e4ba3a0a6e4eda468c65f433 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:04:52 np0005548916.novalocal sudo[7463]: pam_unix(sudo:session): session closed for user root
Dec 06 09:04:55 np0005548916.novalocal sshd-session[7314]: Connection closed by 38.102.83.114 port 43980
Dec 06 09:04:55 np0005548916.novalocal sshd-session[7311]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:04:55 np0005548916.novalocal systemd[1]: session-4.scope: Deactivated successfully.
Dec 06 09:04:55 np0005548916.novalocal systemd-logind[788]: Session 4 logged out. Waiting for processes to exit.
Dec 06 09:04:55 np0005548916.novalocal systemd-logind[788]: Removed session 4.
Dec 06 09:06:48 np0005548916.novalocal systemd[4303]: Created slice User Background Tasks Slice.
Dec 06 09:06:48 np0005548916.novalocal systemd[4303]: Starting Cleanup of User's Temporary Files and Directories...
Dec 06 09:06:48 np0005548916.novalocal systemd[4303]: Finished Cleanup of User's Temporary Files and Directories.
Dec 06 09:10:14 np0005548916.novalocal sshd-session[7496]: error: kex_exchange_identification: read: Connection reset by peer
Dec 06 09:10:14 np0005548916.novalocal sshd-session[7496]: Connection reset by 45.140.17.97 port 55177
Dec 06 09:10:22 np0005548916.novalocal sshd-session[7497]: Received disconnect from 80.94.93.233 port 12558:11:  [preauth]
Dec 06 09:10:22 np0005548916.novalocal sshd-session[7497]: Disconnected from authenticating user root 80.94.93.233 port 12558 [preauth]
Dec 06 09:10:24 np0005548916.novalocal sshd-session[7500]: Accepted publickey for zuul from 38.102.83.114 port 46360 ssh2: RSA SHA256:spwPcL19sPHC+yJA+ECEA4UNmpshOiR8KfgtTbViJeA
Dec 06 09:10:24 np0005548916.novalocal systemd-logind[788]: New session 5 of user zuul.
Dec 06 09:10:24 np0005548916.novalocal systemd[1]: Started Session 5 of User zuul.
Dec 06 09:10:24 np0005548916.novalocal sshd-session[7500]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 06 09:10:24 np0005548916.novalocal sudo[7527]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ushpbxaxgoownnohnwinmfygkvidjtri ; /usr/bin/python3'
Dec 06 09:10:24 np0005548916.novalocal sudo[7527]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:10:24 np0005548916.novalocal python3[7529]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda
                                                       _uses_shell=True zuul_log_id=fa163efc-24cc-6aeb-b52e-000000001cd4-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
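
This task resolves the root disk's major:minor number, the key that cgroup v2 uses to address a block device; the 252:0 written into the io.max files further down is the usual number for a first virtio-blk disk. A shell sketch of the same flow, with the limit values copied from the commands logged below and assuming the io controller is enabled on the target slice:

    # Resolve the device number, then cap the slice's I/O on that device.
    MAJMIN="$(lsblk -nd -o MAJ:MIN /dev/vda | tr -d '[:space:]')"
    echo "$MAJMIN riops=18000 wiops=18000 rbps=262144000 wbps=262144000" \
        | sudo tee /sys/fs/cgroup/system.slice/io.max
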
Dec 06 09:10:24 np0005548916.novalocal sudo[7527]: pam_unix(sudo:session): session closed for user root
Dec 06 09:10:25 np0005548916.novalocal sudo[7555]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fgdymllpogajkieqycefwawhgfcnmzow ; /usr/bin/python3'
Dec 06 09:10:25 np0005548916.novalocal sudo[7555]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:10:25 np0005548916.novalocal python3[7557]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:10:25 np0005548916.novalocal sudo[7555]: pam_unix(sudo:session): session closed for user root
Dec 06 09:10:25 np0005548916.novalocal sudo[7582]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfdjnxziskloswttiujckjvqxnwsugjk ; /usr/bin/python3'
Dec 06 09:10:25 np0005548916.novalocal sudo[7582]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:10:25 np0005548916.novalocal python3[7584]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:10:25 np0005548916.novalocal sudo[7582]: pam_unix(sudo:session): session closed for user root
Dec 06 09:10:25 np0005548916.novalocal sudo[7608]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-firerlxxlsyiifoltvohaaetlylynigc ; /usr/bin/python3'
Dec 06 09:10:25 np0005548916.novalocal sudo[7608]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:10:25 np0005548916.novalocal python3[7610]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:10:25 np0005548916.novalocal sudo[7608]: pam_unix(sudo:session): session closed for user root
Dec 06 09:10:25 np0005548916.novalocal sudo[7634]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ehcsatjgdtxhojpexcplopcyiswlorof ; /usr/bin/python3'
Dec 06 09:10:25 np0005548916.novalocal sudo[7634]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:10:26 np0005548916.novalocal python3[7636]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:10:26 np0005548916.novalocal sudo[7634]: pam_unix(sudo:session): session closed for user root
Dec 06 09:10:26 np0005548916.novalocal sudo[7660]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wmnkaedaayutbawyiowigykgrcevoplz ; /usr/bin/python3'
Dec 06 09:10:26 np0005548916.novalocal sudo[7660]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:10:26 np0005548916.novalocal python3[7662]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:10:26 np0005548916.novalocal sudo[7660]: pam_unix(sudo:session): session closed for user root
Dec 06 09:10:27 np0005548916.novalocal sudo[7738]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfgsksrfjujgocrxrdddvccyrgsuozsd ; /usr/bin/python3'
Dec 06 09:10:27 np0005548916.novalocal sudo[7738]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:10:27 np0005548916.novalocal python3[7740]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 09:10:27 np0005548916.novalocal sudo[7738]: pam_unix(sudo:session): session closed for user root
Dec 06 09:10:27 np0005548916.novalocal sudo[7811]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qlokulekbjtrohcrblbaewllawqybpem ; /usr/bin/python3'
Dec 06 09:10:27 np0005548916.novalocal sudo[7811]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:10:27 np0005548916.novalocal python3[7813]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765012227.1862109-517-246331874625000/source _original_basename=tmpujynt52a follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:10:27 np0005548916.novalocal sudo[7811]: pam_unix(sudo:session): session closed for user root
Dec 06 09:10:28 np0005548916.novalocal sudo[7861]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iggvckdifqftzuzvaxroakwgoritiuwh ; /usr/bin/python3'
Dec 06 09:10:28 np0005548916.novalocal sudo[7861]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:10:28 np0005548916.novalocal python3[7863]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 06 09:10:29 np0005548916.novalocal systemd[1]: Reloading.
Dec 06 09:10:29 np0005548916.novalocal systemd-rc-local-generator[7881]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:10:29 np0005548916.novalocal sudo[7861]: pam_unix(sudo:session): session closed for user root
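
The two tasks before this reload drop an override.conf into /etc/systemd/system.conf.d and then call the systemd_service module with daemon_reload=True, which is what produces the 'Reloading.' line. The override's content is redacted (content=NOT_LOGGING_PARAMETER); given the per-slice io.max writes that follow, enabling manager-level I/O accounting would be a plausible purpose, so treat the property below as a hypothetical example rather than the file's actual contents:

    # Re-read /etc/systemd/system.conf and system.conf.d/*.conf.
    sudo systemctl daemon-reload
    # Hypothetical check of one manager default the override might set.
    systemctl show -p DefaultIOAccounting
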
Dec 06 09:10:30 np0005548916.novalocal sudo[7917]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bmvzizdctfkesrfcbjlbbnqkrghshwjw ; /usr/bin/python3'
Dec 06 09:10:30 np0005548916.novalocal sudo[7917]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:10:31 np0005548916.novalocal python3[7919]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Dec 06 09:10:31 np0005548916.novalocal sudo[7917]: pam_unix(sudo:session): session closed for user root
Dec 06 09:10:31 np0005548916.novalocal sudo[7943]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kjbnkzfrxikcddmurwmrczliuxddxosh ; /usr/bin/python3'
Dec 06 09:10:31 np0005548916.novalocal sudo[7943]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:10:31 np0005548916.novalocal python3[7945]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:10:31 np0005548916.novalocal sudo[7943]: pam_unix(sudo:session): session closed for user root
Dec 06 09:10:31 np0005548916.novalocal sudo[7971]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwlzddaszstpmimnqkoytkofxddywqsb ; /usr/bin/python3'
Dec 06 09:10:31 np0005548916.novalocal sudo[7971]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:10:31 np0005548916.novalocal python3[7973]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:10:31 np0005548916.novalocal sudo[7971]: pam_unix(sudo:session): session closed for user root
Dec 06 09:10:31 np0005548916.novalocal sudo[7999]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kllyczntliqvgyyxrnwxcyyyfglmcesd ; /usr/bin/python3'
Dec 06 09:10:31 np0005548916.novalocal sudo[7999]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:10:31 np0005548916.novalocal python3[8001]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:10:32 np0005548916.novalocal sudo[7999]: pam_unix(sudo:session): session closed for user root
Dec 06 09:10:32 np0005548916.novalocal sudo[8027]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lttxailvnefbwhndvsgyrcsmgfdscfwa ; /usr/bin/python3'
Dec 06 09:10:32 np0005548916.novalocal sudo[8027]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:10:32 np0005548916.novalocal python3[8029]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:10:32 np0005548916.novalocal sudo[8027]: pam_unix(sudo:session): session closed for user root
Dec 06 09:10:33 np0005548916.novalocal python3[8056]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;
                                                       _uses_shell=True zuul_log_id=fa163efc-24cc-6aeb-b52e-000000001cdb-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
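
io.max is the cgroup v2 io-controller limit file: each line is 'MAJ:MIN key=value ...', where riops/wiops cap read/write IOPS, rbps/wbps cap bytes per second, and unset keys read back as 'max'. A short sketch for verifying and selectively clearing a limit, reusing this job's device number:

    # One line per limited device, e.g.
    #   252:0 rbps=262144000 wbps=262144000 riops=18000 wiops=18000
    cat /sys/fs/cgroup/system.slice/io.max
    # Writing "max" for a key removes only that limit, leaving the rest.
    echo "252:0 riops=max" | sudo tee /sys/fs/cgroup/system.slice/io.max
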
Dec 06 09:10:33 np0005548916.novalocal python3[8086]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 09:10:36 np0005548916.novalocal sshd-session[7503]: Connection closed by 38.102.83.114 port 46360
Dec 06 09:10:36 np0005548916.novalocal sshd-session[7500]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:10:36 np0005548916.novalocal systemd[1]: session-5.scope: Deactivated successfully.
Dec 06 09:10:36 np0005548916.novalocal systemd[1]: session-5.scope: Consumed 4.261s CPU time.
Dec 06 09:10:36 np0005548916.novalocal systemd-logind[788]: Session 5 logged out. Waiting for processes to exit.
Dec 06 09:10:36 np0005548916.novalocal systemd-logind[788]: Removed session 5.
Dec 06 09:10:38 np0005548916.novalocal sshd-session[8090]: Accepted publickey for zuul from 38.102.83.114 port 36360 ssh2: RSA SHA256:spwPcL19sPHC+yJA+ECEA4UNmpshOiR8KfgtTbViJeA
Dec 06 09:10:38 np0005548916.novalocal systemd-logind[788]: New session 6 of user zuul.
Dec 06 09:10:38 np0005548916.novalocal systemd[1]: Started Session 6 of User zuul.
Dec 06 09:10:38 np0005548916.novalocal sshd-session[8090]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 06 09:10:38 np0005548916.novalocal sudo[8117]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wygqnuxraapplhoabpjjrantpjrdollu ; /usr/bin/python3'
Dec 06 09:10:38 np0005548916.novalocal sudo[8117]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:10:38 np0005548916.novalocal python3[8119]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
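
The record above is the dnf backend installing podman and buildah; the interactive equivalent is below. On CentOS Stream 9 this pulls in container-selinux, whose scriptlets load an updated policy, which is what produces the repeated 'SELinux: Converting ... SID table entries' kernel messages that follow.

    sudo dnf -y install podman buildah
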
Dec 06 09:10:53 np0005548916.novalocal kernel: SELinux:  Converting 386 SID table entries...
Dec 06 09:10:53 np0005548916.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Dec 06 09:10:53 np0005548916.novalocal kernel: SELinux:  policy capability open_perms=1
Dec 06 09:10:53 np0005548916.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Dec 06 09:10:53 np0005548916.novalocal kernel: SELinux:  policy capability always_check_network=0
Dec 06 09:10:53 np0005548916.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 06 09:10:53 np0005548916.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 06 09:10:53 np0005548916.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 06 09:11:03 np0005548916.novalocal kernel: SELinux:  Converting 386 SID table entries...
Dec 06 09:11:03 np0005548916.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Dec 06 09:11:03 np0005548916.novalocal kernel: SELinux:  policy capability open_perms=1
Dec 06 09:11:03 np0005548916.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Dec 06 09:11:03 np0005548916.novalocal kernel: SELinux:  policy capability always_check_network=0
Dec 06 09:11:03 np0005548916.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 06 09:11:03 np0005548916.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 06 09:11:03 np0005548916.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 06 09:11:12 np0005548916.novalocal kernel: SELinux:  Converting 386 SID table entries...
Dec 06 09:11:12 np0005548916.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Dec 06 09:11:12 np0005548916.novalocal kernel: SELinux:  policy capability open_perms=1
Dec 06 09:11:12 np0005548916.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Dec 06 09:11:12 np0005548916.novalocal kernel: SELinux:  policy capability always_check_network=0
Dec 06 09:11:12 np0005548916.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 06 09:11:12 np0005548916.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 06 09:11:12 np0005548916.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 06 09:11:14 np0005548916.novalocal setsebool[8186]: The virt_use_nfs policy boolean was changed to 1 by root
Dec 06 09:11:14 np0005548916.novalocal setsebool[8186]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
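
Both booleans are flipped persistently in the virt/container policy; setsebool accepts several boolean=value pairs in one call, and -P writes them into the policy store so they survive a reboot, which also explains the further policy reload logged right after:

    sudo setsebool -P virt_use_nfs=1 virt_sandbox_use_all_caps=1
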
Dec 06 09:11:25 np0005548916.novalocal kernel: SELinux:  Converting 389 SID table entries...
Dec 06 09:11:25 np0005548916.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Dec 06 09:11:25 np0005548916.novalocal kernel: SELinux:  policy capability open_perms=1
Dec 06 09:11:25 np0005548916.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Dec 06 09:11:25 np0005548916.novalocal kernel: SELinux:  policy capability always_check_network=0
Dec 06 09:11:25 np0005548916.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 06 09:11:25 np0005548916.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 06 09:11:25 np0005548916.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 06 09:11:43 np0005548916.novalocal dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Dec 06 09:11:44 np0005548916.novalocal systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 06 09:11:44 np0005548916.novalocal systemd[1]: Starting man-db-cache-update.service...
Dec 06 09:11:44 np0005548916.novalocal systemd[1]: Reloading.
Dec 06 09:11:44 np0005548916.novalocal systemd-rc-local-generator[8936]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:11:44 np0005548916.novalocal systemd[1]: Queuing reload/restart jobs for marked units…
Dec 06 09:11:45 np0005548916.novalocal sudo[8117]: pam_unix(sudo:session): session closed for user root
Dec 06 09:11:47 np0005548916.novalocal python3[11217]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"
                                                        _uses_shell=True zuul_log_id=fa163efc-24cc-d561-0a5b-00000000000c-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:11:48 np0005548916.novalocal kernel: evm: overlay not supported
Dec 06 09:11:48 np0005548916.novalocal systemd[4303]: Starting D-Bus User Message Bus...
Dec 06 09:11:48 np0005548916.novalocal dbus-broker-launch[12169]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Dec 06 09:11:48 np0005548916.novalocal dbus-broker-launch[12169]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Dec 06 09:11:48 np0005548916.novalocal systemd[4303]: Started D-Bus User Message Bus.
Dec 06 09:11:48 np0005548916.novalocal dbus-broker-lau[12169]: Ready
Dec 06 09:11:48 np0005548916.novalocal systemd[4303]: selinux: avc:  op=load_policy lsm=selinux seqno=6 res=1
Dec 06 09:11:48 np0005548916.novalocal systemd[4303]: Created slice Slice /user.
Dec 06 09:11:48 np0005548916.novalocal systemd[4303]: podman-12070.scope: unit configures an IP firewall, but not running as root.
Dec 06 09:11:48 np0005548916.novalocal systemd[4303]: (This warning is only shown for the first unit using IP firewalling.)
Dec 06 09:11:48 np0005548916.novalocal systemd[4303]: Started podman-12070.scope.
Dec 06 09:11:48 np0005548916.novalocal systemd[4303]: Started podman-pause-792dbf6a.scope.
Dec 06 09:11:49 np0005548916.novalocal sshd-session[8093]: Connection closed by 38.102.83.114 port 36360
Dec 06 09:11:49 np0005548916.novalocal sshd-session[8090]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:11:49 np0005548916.novalocal systemd[1]: session-6.scope: Deactivated successfully.
Dec 06 09:11:49 np0005548916.novalocal systemd[1]: session-6.scope: Consumed 1min 3.444s CPU time.
Dec 06 09:11:49 np0005548916.novalocal systemd-logind[788]: Session 6 logged out. Waiting for processes to exit.
Dec 06 09:11:49 np0005548916.novalocal systemd-logind[788]: Removed session 6.
Dec 06 09:12:13 np0005548916.novalocal sshd-session[22237]: Connection closed by 38.102.83.98 port 55758 [preauth]
Dec 06 09:12:13 np0005548916.novalocal sshd-session[22239]: Connection closed by 38.102.83.98 port 55772 [preauth]
Dec 06 09:12:13 np0005548916.novalocal sshd-session[22240]: Unable to negotiate with 38.102.83.98 port 55788: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Dec 06 09:12:13 np0005548916.novalocal sshd-session[22245]: Unable to negotiate with 38.102.83.98 port 55794: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Dec 06 09:12:13 np0005548916.novalocal sshd-session[22244]: Unable to negotiate with 38.102.83.98 port 55808: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
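
The burst above, several near-simultaneous connections from the same source each offering exactly one host-key algorithm, is consistent with ssh-keyscan probing for host keys; negotiation fails harmlessly for key types this server has no key for. Two quick reads when diagnosing such 'no matching host key type' lines:

    # Host-key algorithms this sshd effectively offers
    sudo sshd -T | grep -i hostkeyalgorithms
    # Key types this OpenSSH build supports at all
    ssh -Q key
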
Dec 06 09:12:16 np0005548916.novalocal sshd-session[23135]: Received disconnect from 193.46.255.159 port 62972:11:  [preauth]
Dec 06 09:12:16 np0005548916.novalocal sshd-session[23135]: Disconnected from authenticating user root 193.46.255.159 port 62972 [preauth]
Dec 06 09:12:18 np0005548916.novalocal sshd-session[23865]: Accepted publickey for zuul from 38.102.83.114 port 44858 ssh2: RSA SHA256:spwPcL19sPHC+yJA+ECEA4UNmpshOiR8KfgtTbViJeA
Dec 06 09:12:18 np0005548916.novalocal systemd-logind[788]: New session 7 of user zuul.
Dec 06 09:12:18 np0005548916.novalocal systemd[1]: Started Session 7 of User zuul.
Dec 06 09:12:18 np0005548916.novalocal sshd-session[23865]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 06 09:12:18 np0005548916.novalocal python3[23971]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBK/b/hDus+zgErbxpiAu4axJ55LMjNixMhoE4DoEU6Wq/xn30MdVWwMPMhgQamY6n3JqihnzwOz1OzKhBTCdzls= zuul@np0005548914.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
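
ansible.posix.authorized_key appends the given public key to the target user's authorized_keys only if it is not already present, creating ~/.ssh with safe permissions along the way (manage_dir=True). A minimal root-run shell sketch of the same idempotent append, using the key logged above:

    PUBKEY='ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBK/b/hDus+zgErbxpiAu4axJ55LMjNixMhoE4DoEU6Wq/xn30MdVWwMPMhgQamY6n3JqihnzwOz1OzKhBTCdzls= zuul@np0005548914.novalocal'
    install -d -m 700 -o zuul -g zuul ~zuul/.ssh
    grep -qxF "$PUBKEY" ~zuul/.ssh/authorized_keys 2>/dev/null \
        || printf '%s\n' "$PUBKEY" >> ~zuul/.ssh/authorized_keys
    chown zuul:zuul ~zuul/.ssh/authorized_keys && chmod 600 ~zuul/.ssh/authorized_keys
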
Dec 06 09:12:18 np0005548916.novalocal sudo[24141]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jrugxqzesmlbvwsxgisnwzfscibauihm ; /usr/bin/python3'
Dec 06 09:12:18 np0005548916.novalocal sudo[24141]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:12:19 np0005548916.novalocal python3[24155]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBK/b/hDus+zgErbxpiAu4axJ55LMjNixMhoE4DoEU6Wq/xn30MdVWwMPMhgQamY6n3JqihnzwOz1OzKhBTCdzls= zuul@np0005548914.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 09:12:19 np0005548916.novalocal sudo[24141]: pam_unix(sudo:session): session closed for user root
Dec 06 09:12:19 np0005548916.novalocal sudo[24478]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iocfvwodvhagxnfstktbsybmhopqhney ; /usr/bin/python3'
Dec 06 09:12:19 np0005548916.novalocal sudo[24478]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:12:20 np0005548916.novalocal python3[24489]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005548916.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Dec 06 09:12:20 np0005548916.novalocal useradd[24569]: new group: name=cloud-admin, GID=1002
Dec 06 09:12:20 np0005548916.novalocal useradd[24569]: new user: name=cloud-admin, UID=1002, GID=1002, home=/home/cloud-admin, shell=/bin/bash, from=none
Dec 06 09:12:20 np0005548916.novalocal sudo[24478]: pam_unix(sudo:session): session closed for user root
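
With no uid, group, or home overrides supplied, the user module falls through to a plain useradd, which took the next free IDs (1002/1002 here). The shell equivalent:

    # -m creates /home/cloud-admin; EL9 useradd creates a matching group by default.
    sudo useradd -m -s /bin/bash cloud-admin
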
Dec 06 09:12:20 np0005548916.novalocal sudo[24700]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qskxesxilanahkpudfvwfuhwcifxrysm ; /usr/bin/python3'
Dec 06 09:12:20 np0005548916.novalocal sudo[24700]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:12:20 np0005548916.novalocal python3[24712]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBK/b/hDus+zgErbxpiAu4axJ55LMjNixMhoE4DoEU6Wq/xn30MdVWwMPMhgQamY6n3JqihnzwOz1OzKhBTCdzls= zuul@np0005548914.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 06 09:12:20 np0005548916.novalocal sudo[24700]: pam_unix(sudo:session): session closed for user root
Dec 06 09:12:21 np0005548916.novalocal sudo[24990]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uyuvkdsvhwbtqfdizfijnqhxnfpkxouw ; /usr/bin/python3'
Dec 06 09:12:21 np0005548916.novalocal sudo[24990]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:12:21 np0005548916.novalocal python3[25001]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 09:12:21 np0005548916.novalocal sudo[24990]: pam_unix(sudo:session): session closed for user root
Dec 06 09:12:21 np0005548916.novalocal sudo[25267]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zgenlkropcszevdsofiklgjjxlpnncrg ; /usr/bin/python3'
Dec 06 09:12:21 np0005548916.novalocal sudo[25267]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:12:21 np0005548916.novalocal python3[25276]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1765012340.8469179-152-189411988683337/source _original_basename=tmpufrff3wp follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:12:21 np0005548916.novalocal sudo[25267]: pam_unix(sudo:session): session closed for user root
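
The sudoers drop-in just copied is redacted in the log (content=NOT_LOGGING_PARAMETER); only the path and mode 0640 are visible. For a CI service account like this one the file is typically a single passwordless rule, so the content below is a hypothetical reconstruction, not what was actually deployed, and any such file should be parse-checked before use:

    # Hypothetical contents for /etc/sudoers.d/cloud-admin
    echo 'cloud-admin ALL=(ALL) NOPASSWD:ALL' | sudo tee /etc/sudoers.d/cloud-admin
    sudo chmod 0640 /etc/sudoers.d/cloud-admin
    sudo visudo -cf /etc/sudoers.d/cloud-admin   # exits non-zero on syntax errors
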
Dec 06 09:12:22 np0005548916.novalocal sudo[25589]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tqflocessrnugzlakywfaaqnmxphgoti ; /usr/bin/python3'
Dec 06 09:12:22 np0005548916.novalocal sudo[25589]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:12:22 np0005548916.novalocal python3[25600]: ansible-ansible.builtin.hostname Invoked with name=compute-1 use=systemd
Dec 06 09:12:22 np0005548916.novalocal systemd[1]: Starting Hostname Service...
Dec 06 09:12:22 np0005548916.novalocal systemd[1]: Started Hostname Service.
Dec 06 09:12:22 np0005548916.novalocal systemd-hostnamed[25700]: Changed pretty hostname to 'compute-1'
Dec 06 09:12:22 compute-1 systemd-hostnamed[25700]: Hostname set to <compute-1> (static)
Dec 06 09:12:22 compute-1 NetworkManager[7209]: <info>  [1765012342.6127] hostname: static hostname changed from "np0005548916.novalocal" to "compute-1"
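
ansible.builtin.hostname with use=systemd talks to systemd-hostnamed over D-Bus, which is why the Hostname Service starts on demand, both the pretty and static names change, and every journal line from here on carries the host field compute-1. The direct equivalent:

    sudo hostnamectl set-hostname compute-1
    hostnamectl status   # shows the static, pretty, and transient names
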
Dec 06 09:12:22 compute-1 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 06 09:12:22 compute-1 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 06 09:12:22 compute-1 sudo[25589]: pam_unix(sudo:session): session closed for user root
Dec 06 09:12:23 compute-1 sshd-session[23914]: Connection closed by 38.102.83.114 port 44858
Dec 06 09:12:23 compute-1 sshd-session[23865]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:12:23 compute-1 systemd-logind[788]: Session 7 logged out. Waiting for processes to exit.
Dec 06 09:12:23 compute-1 systemd[1]: session-7.scope: Deactivated successfully.
Dec 06 09:12:23 compute-1 systemd[1]: session-7.scope: Consumed 2.503s CPU time.
Dec 06 09:12:23 compute-1 systemd-logind[788]: Removed session 7.
Dec 06 09:12:24 compute-1 irqbalance[783]: Cannot change IRQ 26 affinity: Operation not permitted
Dec 06 09:12:24 compute-1 irqbalance[783]: IRQ 26 affinity is now unmanaged
Dec 06 09:12:32 compute-1 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 06 09:12:34 compute-1 irqbalance[783]: Cannot change IRQ 27 affinity: Operation not permitted
Dec 06 09:12:34 compute-1 irqbalance[783]: IRQ 27 affinity is now unmanaged
Dec 06 09:12:34 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 06 09:12:34 compute-1 systemd[1]: Finished man-db-cache-update.service.
Dec 06 09:12:34 compute-1 systemd[1]: man-db-cache-update.service: Consumed 1min 2.454s CPU time.
Dec 06 09:12:34 compute-1 systemd[1]: run-r3c7901b7490f4b96aebac3fa603e46bc.service: Deactivated successfully.
Dec 06 09:12:52 compute-1 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 06 09:14:10 compute-1 sshd-session[29917]: Connection closed by 122.114.225.205 port 33096 [preauth]
Dec 06 09:15:48 compute-1 systemd[1]: Starting Cleanup of Temporary Directories...
Dec 06 09:15:48 compute-1 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Dec 06 09:15:48 compute-1 systemd[1]: Finished Cleanup of Temporary Directories.
Dec 06 09:15:48 compute-1 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Dec 06 09:16:07 compute-1 sshd-session[29925]: Accepted publickey for zuul from 38.102.83.98 port 37918 ssh2: RSA SHA256:spwPcL19sPHC+yJA+ECEA4UNmpshOiR8KfgtTbViJeA
Dec 06 09:16:07 compute-1 systemd-logind[788]: New session 8 of user zuul.
Dec 06 09:16:07 compute-1 systemd[1]: Started Session 8 of User zuul.
Dec 06 09:16:07 compute-1 sshd-session[29925]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 06 09:16:07 compute-1 python3[30001]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:16:10 compute-1 sudo[30115]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nykrayayrjwysaqbiavpogukclknlwfh ; /usr/bin/python3'
Dec 06 09:16:10 compute-1 sudo[30115]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:16:10 compute-1 python3[30117]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 09:16:10 compute-1 sudo[30115]: pam_unix(sudo:session): session closed for user root
Dec 06 09:16:10 compute-1 sudo[30188]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cowoskehbzrszkfzexbtmqqyzytrwjzi ; /usr/bin/python3'
Dec 06 09:16:10 compute-1 sudo[30188]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:16:10 compute-1 python3[30190]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1765012570.2087073-33925-92850036764109/source mode=0755 _original_basename=delorean.repo follow=False checksum=39c885eb875fd03e010d1b0454241c26b121dfb2 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:16:10 compute-1 sudo[30188]: pam_unix(sudo:session): session closed for user root
Dec 06 09:16:11 compute-1 sudo[30214]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxxnehwypjmyonfnhemrobduvfeidysb ; /usr/bin/python3'
Dec 06 09:16:11 compute-1 sudo[30214]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:16:11 compute-1 python3[30216]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 09:16:11 compute-1 sudo[30214]: pam_unix(sudo:session): session closed for user root
Dec 06 09:16:11 compute-1 sudo[30287]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ynzhyspzdcqxxuhiefsyljwznapyrzpe ; /usr/bin/python3'
Dec 06 09:16:11 compute-1 sudo[30287]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:16:11 compute-1 python3[30289]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1765012570.2087073-33925-92850036764109/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=4ebc56dead962b5d40b8d420dad43b948b84d3fc backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:16:11 compute-1 sudo[30287]: pam_unix(sudo:session): session closed for user root
Dec 06 09:16:11 compute-1 sudo[30313]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rckjoxrnoyaexugtlohhhvnmylsltqmu ; /usr/bin/python3'
Dec 06 09:16:11 compute-1 sudo[30313]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:16:11 compute-1 python3[30315]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 09:16:11 compute-1 sudo[30313]: pam_unix(sudo:session): session closed for user root
Dec 06 09:16:12 compute-1 sudo[30386]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xolzxrecondiymhymepigwawvroboxzb ; /usr/bin/python3'
Dec 06 09:16:12 compute-1 sudo[30386]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:16:12 compute-1 python3[30388]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1765012570.2087073-33925-92850036764109/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:16:12 compute-1 sudo[30386]: pam_unix(sudo:session): session closed for user root
Dec 06 09:16:12 compute-1 sudo[30412]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hfwasxtckzolsomtgpsswshfxgazzdxa ; /usr/bin/python3'
Dec 06 09:16:12 compute-1 sudo[30412]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:16:12 compute-1 python3[30414]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 09:16:12 compute-1 sudo[30412]: pam_unix(sudo:session): session closed for user root
Dec 06 09:16:12 compute-1 sudo[30485]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-amwlpburhijtqzdvufiuwvpsttdyflos ; /usr/bin/python3'
Dec 06 09:16:12 compute-1 sudo[30485]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:16:12 compute-1 python3[30487]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1765012570.2087073-33925-92850036764109/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:16:12 compute-1 sudo[30485]: pam_unix(sudo:session): session closed for user root
Dec 06 09:16:13 compute-1 sudo[30511]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-haxyqxfatfpvyyttionybwdhvxwvgmsg ; /usr/bin/python3'
Dec 06 09:16:13 compute-1 sudo[30511]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:16:13 compute-1 python3[30513]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 09:16:13 compute-1 sudo[30511]: pam_unix(sudo:session): session closed for user root
Dec 06 09:16:13 compute-1 sudo[30584]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kubwlhnlonfvfwwfthbcwzdpyocskqsk ; /usr/bin/python3'
Dec 06 09:16:13 compute-1 sudo[30584]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:16:13 compute-1 python3[30586]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1765012570.2087073-33925-92850036764109/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:16:13 compute-1 sudo[30584]: pam_unix(sudo:session): session closed for user root
Dec 06 09:16:13 compute-1 sudo[30610]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mcmuscvalzgqulshjowvbihwmkhslcxg ; /usr/bin/python3'
Dec 06 09:16:13 compute-1 sudo[30610]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:16:13 compute-1 python3[30612]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 09:16:13 compute-1 sudo[30610]: pam_unix(sudo:session): session closed for user root
Dec 06 09:16:14 compute-1 sudo[30683]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-alwjqnfmgamobyjqsvlrtmrrpuwndgpl ; /usr/bin/python3'
Dec 06 09:16:14 compute-1 sudo[30683]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:16:14 compute-1 python3[30685]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1765012570.2087073-33925-92850036764109/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:16:14 compute-1 sudo[30683]: pam_unix(sudo:session): session closed for user root
Dec 06 09:16:14 compute-1 sudo[30709]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bwikutyqcgpcyssupdxgywwszejjmnjk ; /usr/bin/python3'
Dec 06 09:16:14 compute-1 sudo[30709]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:16:14 compute-1 python3[30711]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 09:16:14 compute-1 sudo[30709]: pam_unix(sudo:session): session closed for user root
Dec 06 09:16:14 compute-1 sudo[30782]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fheqfmntqjmxxdkhzdqgtynttnwuscew ; /usr/bin/python3'
Dec 06 09:16:14 compute-1 sudo[30782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:16:15 compute-1 python3[30784]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1765012570.2087073-33925-92850036764109/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=6e18e2038d54303b4926db53c0b6cced515a9151 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:16:15 compute-1 sudo[30782]: pam_unix(sudo:session): session closed for user root
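The three copy tasks above each record a SHA-1 for the deployed repo file (the payload itself is suppressed as NOT_LOGGING_PARAMETER). A minimal spot-check against the logged values, assuming the files are unchanged since deployment:

    # compare against the checksums logged by ansible.legacy.copy
    sha1sum /etc/yum.repos.d/repo-setup-centos-appstream.repo   # e89244d2503b2996429dda1857290c1e91e393a1
    sha1sum /etc/yum.repos.d/repo-setup-centos-baseos.repo      # 36d926db23a40dbfa5c84b5e4d43eac6fa2301d6
    sha1sum /etc/yum.repos.d/delorean.repo.md5                  # 6e18e2038d54303b4926db53c0b6cced515a9151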
Dec 06 09:16:26 compute-1 python3[30832]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:21:01 compute-1 anacron[4460]: Job `cron.daily' started
Dec 06 09:21:01 compute-1 anacron[4460]: Job `cron.daily' terminated
Dec 06 09:21:26 compute-1 sshd-session[29928]: Received disconnect from 38.102.83.98 port 37918:11: disconnected by user
Dec 06 09:21:26 compute-1 sshd-session[29928]: Disconnected from user zuul 38.102.83.98 port 37918
Dec 06 09:21:26 compute-1 sshd-session[29925]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:21:26 compute-1 systemd[1]: session-8.scope: Deactivated successfully.
Dec 06 09:21:26 compute-1 systemd[1]: session-8.scope: Consumed 5.601s CPU time.
Dec 06 09:21:26 compute-1 systemd-logind[788]: Session 8 logged out. Waiting for processes to exit.
Dec 06 09:21:26 compute-1 systemd-logind[788]: Removed session 8.
Dec 06 09:23:38 compute-1 systemd[1]: Starting dnf makecache...
Dec 06 09:23:38 compute-1 dnf[30838]: Failed determining last makecache time.
Dec 06 09:23:38 compute-1 dnf[30838]: delorean-openstack-barbican-42b4c41831408a8e323 405 kB/s |  13 kB     00:00
Dec 06 09:23:38 compute-1 dnf[30838]: delorean-python-glean-10df0bd91b9bc5c9fd9cc02d7 3.3 MB/s |  65 kB     00:00
Dec 06 09:23:39 compute-1 dnf[30838]: delorean-openstack-cinder-1c00d6490d88e436f26ef 1.4 MB/s |  32 kB     00:00
Dec 06 09:23:39 compute-1 dnf[30838]: delorean-python-stevedore-c4acc5639fd2329372142 6.4 MB/s | 131 kB     00:00
Dec 06 09:23:39 compute-1 dnf[30838]: delorean-python-cloudkitty-tests-tempest-2c80f8 1.4 MB/s |  32 kB     00:00
Dec 06 09:23:39 compute-1 dnf[30838]: delorean-os-net-config-d0cedbdb788d43e5c7551df5  13 MB/s | 349 kB     00:00
Dec 06 09:23:39 compute-1 dnf[30838]: delorean-openstack-nova-6f8decf0b4f1aa2e96292b6 1.8 MB/s |  42 kB     00:00
Dec 06 09:23:39 compute-1 dnf[30838]: delorean-python-designate-tests-tempest-347fdbc 851 kB/s |  18 kB     00:00
Dec 06 09:23:39 compute-1 dnf[30838]: delorean-openstack-glance-1fd12c29b339f30fe823e 888 kB/s |  18 kB     00:00
Dec 06 09:23:39 compute-1 dnf[30838]: delorean-openstack-keystone-e4b40af0ae3698fbbbb 1.2 MB/s |  29 kB     00:00
Dec 06 09:23:39 compute-1 dnf[30838]: delorean-openstack-manila-3c01b7181572c95dac462 622 kB/s |  25 kB     00:00
Dec 06 09:23:39 compute-1 dnf[30838]: delorean-python-whitebox-neutron-tests-tempest- 6.9 MB/s | 154 kB     00:00
Dec 06 09:23:39 compute-1 dnf[30838]: delorean-openstack-octavia-ba397f07a7331190208c 1.2 MB/s |  26 kB     00:00
Dec 06 09:23:39 compute-1 dnf[30838]: delorean-openstack-watcher-c014f81a8647287f6dcc 771 kB/s |  16 kB     00:00
Dec 06 09:23:39 compute-1 dnf[30838]: delorean-ansible-config_template-5ccaa22121a7ff 373 kB/s | 7.4 kB     00:00
Dec 06 09:23:39 compute-1 dnf[30838]: delorean-puppet-ceph-7352068d7b8c84ded636ab3158 7.0 MB/s | 144 kB     00:00
Dec 06 09:23:39 compute-1 dnf[30838]: delorean-openstack-swift-dc98a8463506ac520c469a 622 kB/s |  14 kB     00:00
Dec 06 09:23:39 compute-1 dnf[30838]: delorean-python-tempestconf-8515371b7cceebd4282 2.8 MB/s |  53 kB     00:00
Dec 06 09:23:39 compute-1 dnf[30838]: delorean-openstack-heat-ui-013accbfd179753bc3f0 4.3 MB/s |  96 kB     00:00
Dec 06 09:23:40 compute-1 dnf[30838]: CentOS Stream 9 - BaseOS                         24 kB/s | 7.3 kB     00:00
Dec 06 09:23:40 compute-1 dnf[30838]: CentOS Stream 9 - AppStream                      48 kB/s | 7.4 kB     00:00
Dec 06 09:23:40 compute-1 dnf[30838]: CentOS Stream 9 - CRB                            32 kB/s | 7.2 kB     00:00
Dec 06 09:23:41 compute-1 dnf[30838]: CentOS Stream 9 - Extras packages                28 kB/s | 8.3 kB     00:00
Dec 06 09:23:41 compute-1 dnf[30838]: dlrn-antelope-testing                            27 MB/s | 1.1 MB     00:00
Dec 06 09:23:41 compute-1 dnf[30838]: dlrn-antelope-build-deps                         16 MB/s | 461 kB     00:00
Dec 06 09:23:41 compute-1 dnf[30838]: centos9-rabbitmq                                8.1 MB/s | 123 kB     00:00
Dec 06 09:23:41 compute-1 dnf[30838]: centos9-storage                                  20 MB/s | 415 kB     00:00
Dec 06 09:23:41 compute-1 dnf[30838]: centos9-opstools                                3.3 MB/s |  51 kB     00:00
Dec 06 09:23:42 compute-1 dnf[30838]: NFV SIG OpenvSwitch                              26 MB/s | 456 kB     00:00
Dec 06 09:23:42 compute-1 dnf[30838]: repo-setup-centos-appstream                      71 MB/s |  25 MB     00:00
Dec 06 09:23:48 compute-1 dnf[30838]: repo-setup-centos-baseos                         70 MB/s | 8.8 MB     00:00
Dec 06 09:23:50 compute-1 dnf[30838]: repo-setup-centos-highavailability               36 MB/s | 744 kB     00:00
Dec 06 09:23:50 compute-1 dnf[30838]: repo-setup-centos-powertools                     76 MB/s | 7.3 MB     00:00
Dec 06 09:24:03 compute-1 dnf[30838]: Extra Packages for Enterprise Linux 9 - x86_64  1.8 MB/s |  20 MB     00:11
Dec 06 09:24:23 compute-1 dnf[30838]: Metadata cache created.
Dec 06 09:24:23 compute-1 systemd[1]: dnf-makecache.service: Deactivated successfully.
Dec 06 09:24:23 compute-1 systemd[1]: Finished dnf makecache.
Dec 06 09:24:23 compute-1 systemd[1]: dnf-makecache.service: Consumed 31.729s CPU time.
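The "Failed determining last makecache time" line at 09:23:38 is what dnf prints when the timer-driven cache runs with no previous timestamp on record; the service then refreshes every enabled repo, which is why all the delorean and repo-setup repos deployed above appear in the download list. The same refresh can be inspected or triggered by hand, assuming the stock dnf-makecache units:

    systemctl list-timers dnf-makecache.timer
    dnf makecache --timer    # what the service runs; honors metadata_timer_sync
    dnf makecache            # unconditional metadata refresh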
Dec 06 09:27:49 compute-1 sshd-session[30942]: Received disconnect from 193.46.255.7 port 26052:11:  [preauth]
Dec 06 09:27:49 compute-1 sshd-session[30942]: Disconnected from authenticating user root 193.46.255.7 port 26052 [preauth]
Dec 06 09:27:56 compute-1 sshd-session[30944]: Accepted publickey for zuul from 192.168.122.30 port 60398 ssh2: ECDSA SHA256:r1j7aLsKAM+XxDNbzEU5vWGpGNCOaIBwc7FZdATPttA
Dec 06 09:27:56 compute-1 systemd-logind[788]: New session 9 of user zuul.
Dec 06 09:27:56 compute-1 systemd[1]: Started Session 9 of User zuul.
Dec 06 09:27:56 compute-1 sshd-session[30944]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 06 09:27:57 compute-1 python3.9[31097]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:27:58 compute-1 sudo[31276]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqhgzjswitfpuvpnychcrncydwgpusff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013278.3548117-57-78474160565331/AnsiballZ_command.py'
Dec 06 09:27:58 compute-1 sudo[31276]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:27:59 compute-1 python3.9[31278]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail
                                            pushd /var/tmp
                                            curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
                                            pushd repo-setup-main
                                            python3 -m venv ./venv
                                            PBR_VERSION=0.0.0 ./venv/bin/pip install ./
                                            ./venv/bin/repo-setup current-podified -b antelope
                                            popd
                                            rm -rf repo-setup-main
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:28:06 compute-1 sudo[31276]: pam_unix(sudo:session): session closed for user root
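The repo-setup payload logged above installs the tool into a throwaway virtualenv; PBR_VERSION=0.0.0 bypasses pbr's git-based version detection, which would otherwise fail in a tarball checkout with no .git directory. A standalone equivalent of the same steps, kept outside /var/tmp cleanup for clarity (the venv path is illustrative):

    cd /var/tmp
    curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
    python3 -m venv /tmp/rs-venv
    PBR_VERSION=0.0.0 /tmp/rs-venv/bin/pip install ./repo-setup-main
    /tmp/rs-venv/bin/repo-setup current-podified -b antelope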
Dec 06 09:28:06 compute-1 sshd-session[30947]: Connection closed by 192.168.122.30 port 60398
Dec 06 09:28:06 compute-1 sshd-session[30944]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:28:06 compute-1 systemd[1]: session-9.scope: Deactivated successfully.
Dec 06 09:28:06 compute-1 systemd[1]: session-9.scope: Consumed 8.417s CPU time.
Dec 06 09:28:06 compute-1 systemd-logind[788]: Session 9 logged out. Waiting for processes to exit.
Dec 06 09:28:06 compute-1 systemd-logind[788]: Removed session 9.
Dec 06 09:28:22 compute-1 sshd-session[31335]: Accepted publickey for zuul from 192.168.122.30 port 33152 ssh2: ECDSA SHA256:r1j7aLsKAM+XxDNbzEU5vWGpGNCOaIBwc7FZdATPttA
Dec 06 09:28:22 compute-1 systemd-logind[788]: New session 10 of user zuul.
Dec 06 09:28:22 compute-1 systemd[1]: Started Session 10 of User zuul.
Dec 06 09:28:22 compute-1 sshd-session[31335]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 06 09:28:22 compute-1 python3.9[31488]: ansible-ansible.legacy.ping Invoked with data=pong
Dec 06 09:28:24 compute-1 python3.9[31662]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:28:24 compute-1 sudo[31812]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kmxvivjlvuwjqvqwhuqffbjiybidmjub ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013304.4801471-94-158092490274407/AnsiballZ_command.py'
Dec 06 09:28:24 compute-1 sudo[31812]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:28:25 compute-1 python3.9[31814]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:28:25 compute-1 sudo[31812]: pam_unix(sudo:session): session closed for user root
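The which probe above only confirms growvols is reachable on a sanitized root PATH; a hedged shell equivalent:

    PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin command -v growvols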
Dec 06 09:28:25 compute-1 sudo[31965]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gnrrjwjwdazgmoqgvtpbrilzxtrxmloe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013305.5224898-130-242710386354316/AnsiballZ_stat.py'
Dec 06 09:28:25 compute-1 sudo[31965]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:28:26 compute-1 python3.9[31967]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:28:26 compute-1 sudo[31965]: pam_unix(sudo:session): session closed for user root
Dec 06 09:28:26 compute-1 sudo[32117]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldizjzbggqkadpyhuydfusrfowmyoypp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013306.317952-154-132958936172945/AnsiballZ_file.py'
Dec 06 09:28:26 compute-1 sudo[32117]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:28:27 compute-1 python3.9[32119]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:28:27 compute-1 sudo[32117]: pam_unix(sudo:session): session closed for user root
Dec 06 09:28:27 compute-1 sudo[32269]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qonxjdrajyaghqnsbjbzlemzdhupsrjq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013307.3222613-178-89372333289099/AnsiballZ_stat.py'
Dec 06 09:28:27 compute-1 sudo[32269]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:28:27 compute-1 python3.9[32271]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:28:27 compute-1 sudo[32269]: pam_unix(sudo:session): session closed for user root
Dec 06 09:28:28 compute-1 sudo[32392]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mvblrtsorxkkmhdftfuapoeyznjdxxrq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013307.3222613-178-89372333289099/AnsiballZ_copy.py'
Dec 06 09:28:28 compute-1 sudo[32392]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:28:28 compute-1 python3.9[32394]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013307.3222613-178-89372333289099/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:28:28 compute-1 sudo[32392]: pam_unix(sudo:session): session closed for user root
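Files under /etc/ansible/facts.d surface as ansible_local facts on the next setup run (the gather at 09:28:29 below picks this one up); mode 755 matters because executable .fact files are run and must print JSON. The deployed content is not logged, so this is a purely hypothetical fact of the same shape:

    cat >/etc/ansible/facts.d/bootc.fact <<'EOF'
    #!/bin/sh
    # executable fact: stdout must be JSON; key becomes ansible_local.bootc
    echo '{"bootc": false}'
    EOF
    chmod 0755 /etc/ansible/facts.d/bootc.fact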
Dec 06 09:28:29 compute-1 sudo[32544]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kagdiaxequxugyhchyqsnbkpkvqfgoza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013308.7548182-223-235257235709785/AnsiballZ_setup.py'
Dec 06 09:28:29 compute-1 sudo[32544]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:28:29 compute-1 python3.9[32546]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:28:29 compute-1 sudo[32544]: pam_unix(sudo:session): session closed for user root
Dec 06 09:28:30 compute-1 sudo[32700]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldyfvlfalbpcvulhaviyzmqmdbwbreej ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013309.7452004-247-23563547851275/AnsiballZ_file.py'
Dec 06 09:28:30 compute-1 sudo[32700]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:28:30 compute-1 python3.9[32702]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:28:30 compute-1 sudo[32700]: pam_unix(sudo:session): session closed for user root
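Creating /var/log/journal switches systemd-journald to persistent storage (with the default Storage=auto); journald starts writing there after a flush or reboot. By hand, assuming default journald.conf:

    mkdir -p /var/log/journal
    chmod 0750 /var/log/journal
    journalctl --flush    # move the runtime journal from /run to /var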
Dec 06 09:28:30 compute-1 sudo[32852]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xwfxebdcjjctwmhijtlzbcncyherumyo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013310.5802414-274-16872433604739/AnsiballZ_file.py'
Dec 06 09:28:30 compute-1 sudo[32852]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:28:31 compute-1 python3.9[32854]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:28:31 compute-1 sudo[32852]: pam_unix(sudo:session): session closed for user root
Dec 06 09:28:32 compute-1 python3.9[33004]: ansible-ansible.builtin.service_facts Invoked
Dec 06 09:28:39 compute-1 python3.9[33258]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
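The lineinfile call above presumably serves as a presence check for cloud-init=disabled on the kernel command line, since /proc/cmdline cannot be written; the task only succeeds when the line is already there. An equivalent check:

    grep -qw 'cloud-init=disabled' /proc/cmdline && echo present || echo absent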
Dec 06 09:28:40 compute-1 python3.9[33408]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:28:41 compute-1 python3.9[33562]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:28:42 compute-1 sudo[33718]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-alhcaqddoukwzwlhtcjvxlttlqjocsqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013322.1572995-418-162233466879612/AnsiballZ_setup.py'
Dec 06 09:28:42 compute-1 sudo[33718]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:28:42 compute-1 python3.9[33720]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 06 09:28:43 compute-1 sudo[33718]: pam_unix(sudo:session): session closed for user root
Dec 06 09:28:43 compute-1 sudo[33802]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vfgguvdpgicxarpizabicngtonvtsjbh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013322.1572995-418-162233466879612/AnsiballZ_dnf.py'
Dec 06 09:28:43 compute-1 sudo[33802]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:28:43 compute-1 python3.9[33804]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 06 09:29:28 compute-1 sshd-session[33949]: Received disconnect from 193.46.255.159 port 25512:11:  [preauth]
Dec 06 09:29:28 compute-1 sshd-session[33949]: Disconnected from authenticating user root 193.46.255.159 port 25512 [preauth]
Dec 06 09:29:34 compute-1 systemd[1]: Reloading.
Dec 06 09:29:34 compute-1 systemd-rc-local-generator[34003]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:29:34 compute-1 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Dec 06 09:29:35 compute-1 systemd[1]: Reloading.
Dec 06 09:29:35 compute-1 systemd-rc-local-generator[34040]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:29:35 compute-1 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Dec 06 09:29:35 compute-1 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Dec 06 09:29:35 compute-1 systemd[1]: Reloading.
Dec 06 09:29:35 compute-1 systemd-rc-local-generator[34081]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:29:35 compute-1 systemd[1]: Listening on LVM2 poll daemon socket.
Dec 06 09:29:35 compute-1 dbus-broker-launch[770]: Noticed file-system modification, trigger reload.
Dec 06 09:29:35 compute-1 dbus-broker-launch[770]: Noticed file-system modification, trigger reload.
Dec 06 09:29:35 compute-1 dbus-broker-launch[770]: Noticed file-system modification, trigger reload.
Dec 06 09:30:43 compute-1 kernel: SELinux:  Converting 2719 SID table entries...
Dec 06 09:30:43 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Dec 06 09:30:43 compute-1 kernel: SELinux:  policy capability open_perms=1
Dec 06 09:30:43 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Dec 06 09:30:43 compute-1 kernel: SELinux:  policy capability always_check_network=0
Dec 06 09:30:43 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 06 09:30:43 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 06 09:30:43 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 06 09:30:43 compute-1 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Dec 06 09:30:44 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 06 09:30:44 compute-1 systemd[1]: Starting man-db-cache-update.service...
Dec 06 09:30:44 compute-1 systemd[1]: Reloading.
Dec 06 09:30:44 compute-1 systemd-rc-local-generator[34411]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:30:44 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 06 09:30:44 compute-1 sudo[33802]: pam_unix(sudo:session): session closed for user root
Dec 06 09:30:45 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 06 09:30:45 compute-1 systemd[1]: Finished man-db-cache-update.service.
Dec 06 09:30:45 compute-1 systemd[1]: man-db-cache-update.service: Consumed 1.336s CPU time.
Dec 06 09:30:45 compute-1 systemd[1]: run-r37da2513ed4f420f90551d7aff076297.service: Deactivated successfully.
Dec 06 09:30:45 compute-1 sudo[35330]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-diickwffbrqdhfpqwcqbiqasccjnpivu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013445.1604009-455-106065773651953/AnsiballZ_command.py'
Dec 06 09:30:45 compute-1 sudo[35330]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:30:45 compute-1 python3.9[35332]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:30:46 compute-1 sudo[35330]: pam_unix(sudo:session): session closed for user root
Dec 06 09:30:47 compute-1 sudo[35611]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mmvpmysvfiidryhnhgyszvtuqybgkczp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013447.1737425-478-102782996593230/AnsiballZ_selinux.py'
Dec 06 09:30:47 compute-1 sudo[35611]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:30:48 compute-1 python3.9[35613]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Dec 06 09:30:48 compute-1 sudo[35611]: pam_unix(sudo:session): session closed for user root
Dec 06 09:30:49 compute-1 sudo[35763]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kqyxiznajqvleplbtnywudiddzcojijy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013448.8393655-511-221508989080132/AnsiballZ_command.py'
Dec 06 09:30:49 compute-1 sudo[35763]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:30:49 compute-1 python3.9[35765]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Dec 06 09:30:53 compute-1 sudo[35763]: pam_unix(sudo:session): session closed for user root
Dec 06 09:30:54 compute-1 sudo[35916]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pcshbxdreuwuzogrvxigboxlayrluicl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013453.9329002-535-1949129390934/AnsiballZ_file.py'
Dec 06 09:30:54 compute-1 sudo[35916]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:30:54 compute-1 python3.9[35918]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:30:54 compute-1 sudo[35916]: pam_unix(sudo:session): session closed for user root
Dec 06 09:30:55 compute-1 sudo[36068]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhuwwueiyaksmmnopltquumnoetiiddo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013455.3217657-559-138508031101540/AnsiballZ_mount.py'
Dec 06 09:30:55 compute-1 sudo[36068]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:30:56 compute-1 python3.9[36070]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Dec 06 09:30:56 compute-1 sudo[36068]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:00 compute-1 sudo[36220]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-knsyklguawjrkszruqejqyeeukecugnm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013459.8392172-643-158982001241950/AnsiballZ_file.py'
Dec 06 09:31:00 compute-1 sudo[36220]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:04 compute-1 python3.9[36222]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:31:04 compute-1 sudo[36220]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:05 compute-1 sudo[36372]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vmdwcszmyntleczcsglnwobyrzkfzzkn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013464.7308097-667-271352270742988/AnsiballZ_stat.py'
Dec 06 09:31:05 compute-1 sudo[36372]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:05 compute-1 python3.9[36374]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:31:05 compute-1 sudo[36372]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:05 compute-1 sudo[36495]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qebqechxzmnouvxqlqqvvtdnzrinxlxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013464.7308097-667-271352270742988/AnsiballZ_copy.py'
Dec 06 09:31:05 compute-1 sudo[36495]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:05 compute-1 python3.9[36497]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013464.7308097-667-271352270742988/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=22c202a539af259b977a1afda61dbc1fe0d1039c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:31:05 compute-1 sudo[36495]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:10 compute-1 sudo[36647]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iwidnornnrvwtmrisoggoqjanxvxvehs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013470.5362294-739-265367463952803/AnsiballZ_stat.py'
Dec 06 09:31:10 compute-1 sudo[36647]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:10 compute-1 python3.9[36649]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:31:10 compute-1 sudo[36647]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:11 compute-1 sudo[36799]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pqyptmrxuyiaphtzbzjtezskmspeatrh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013471.3723202-763-200375161441494/AnsiballZ_command.py'
Dec 06 09:31:11 compute-1 sudo[36799]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:11 compute-1 python3.9[36801]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:31:11 compute-1 sudo[36799]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:12 compute-1 sudo[36952]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fzqwmmjcirnbhfnezxnmtfdeyfszdvpv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013472.3910532-787-130150140806254/AnsiballZ_file.py'
Dec 06 09:31:12 compute-1 sudo[36952]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:12 compute-1 python3.9[36954]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:31:12 compute-1 sudo[36952]: pam_unix(sudo:session): session closed for user root
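vgimportdevices --all populates /etc/lvm/devices/system.devices from the PVs currently visible, and the follow-up touch guarantees the file exists even when no PVs were found, which keeps lvm2 operating in devices-file mode. A shell rendering of the same sequence:

    vgimportdevices --all
    # ensure the devices file exists with the logged 0600 mode even if empty
    [ -e /etc/lvm/devices/system.devices ] || install -m 0600 /dev/null /etc/lvm/devices/system.devices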
Dec 06 09:31:13 compute-1 sudo[37104]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qnncfavbiqrvuvupnleuveqgxiypzbts ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013473.455012-820-203302213546229/AnsiballZ_getent.py'
Dec 06 09:31:13 compute-1 sudo[37104]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:14 compute-1 python3.9[37106]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Dec 06 09:31:14 compute-1 sudo[37104]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:14 compute-1 sudo[37257]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fegqdswybvuhdvyopphqqhyacovxliav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013474.3545718-844-178496978224890/AnsiballZ_group.py'
Dec 06 09:31:14 compute-1 sudo[37257]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:15 compute-1 python3.9[37259]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 06 09:31:15 compute-1 groupadd[37260]: group added to /etc/group: name=qemu, GID=107
Dec 06 09:31:15 compute-1 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 06 09:31:15 compute-1 groupadd[37260]: group added to /etc/gshadow: name=qemu
Dec 06 09:31:15 compute-1 groupadd[37260]: new group: name=qemu, GID=107
Dec 06 09:31:15 compute-1 sudo[37257]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:15 compute-1 sudo[37416]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yguzztkshjhqvwxvclomalgvsmfzxgka ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013475.3840966-868-32432434251147/AnsiballZ_user.py'
Dec 06 09:31:15 compute-1 sudo[37416]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:16 compute-1 python3.9[37418]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec 06 09:31:16 compute-1 useradd[37420]: new user: name=qemu, UID=107, GID=107, home=/home/qemu, shell=/sbin/nologin, from=/dev/pts/0
Dec 06 09:31:16 compute-1 sudo[37416]: pam_unix(sudo:session): session closed for user root
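The getent/group/user triple above is the usual idempotent pattern for pinning a system account to fixed IDs before a package would otherwise allocate them. A shell equivalent of what ran here:

    getent group qemu  >/dev/null || groupadd -g 107 qemu
    getent passwd qemu >/dev/null || useradd -u 107 -g qemu -s /sbin/nologin -c 'qemu user' qemu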
Dec 06 09:31:16 compute-1 sudo[37576]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ltlmjqjgtohwbxrkspnzdqizlvumvorn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013476.559411-892-105533134332590/AnsiballZ_getent.py'
Dec 06 09:31:16 compute-1 sudo[37576]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:17 compute-1 python3.9[37578]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Dec 06 09:31:17 compute-1 sudo[37576]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:17 compute-1 sudo[37729]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-skhjvihxlskrwipekmoeywradfzojkpq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013477.4735944-916-45304121667872/AnsiballZ_group.py'
Dec 06 09:31:17 compute-1 sudo[37729]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:17 compute-1 python3.9[37731]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 06 09:31:17 compute-1 groupadd[37732]: group added to /etc/group: name=hugetlbfs, GID=42477
Dec 06 09:31:17 compute-1 groupadd[37732]: group added to /etc/gshadow: name=hugetlbfs
Dec 06 09:31:18 compute-1 groupadd[37732]: new group: name=hugetlbfs, GID=42477
Dec 06 09:31:18 compute-1 sudo[37729]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:18 compute-1 sudo[37887]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zgdfmpbvnfmedratscohethkdedvnoot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013478.4713435-943-58422966433318/AnsiballZ_file.py'
Dec 06 09:31:18 compute-1 sudo[37887]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:18 compute-1 python3.9[37889]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Dec 06 09:31:18 compute-1 sudo[37887]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:19 compute-1 sudo[38039]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xuojlafbxwaplzhhenxalbzddgrzlbki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013479.517794-976-103482750706860/AnsiballZ_dnf.py'
Dec 06 09:31:19 compute-1 sudo[38039]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:20 compute-1 python3.9[38041]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 06 09:31:22 compute-1 sudo[38039]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:23 compute-1 sudo[38192]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mrncghiqkiaqlclhlstmzdbtlxmaxysf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013482.8154154-1000-53145255055686/AnsiballZ_file.py'
Dec 06 09:31:23 compute-1 sudo[38192]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:23 compute-1 python3.9[38194]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:31:23 compute-1 sudo[38192]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:23 compute-1 sudo[38344]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fcfewkrkbbpaktyqaufhmftzdnihywrg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013483.6196544-1024-183795242943665/AnsiballZ_stat.py'
Dec 06 09:31:23 compute-1 sudo[38344]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:24 compute-1 python3.9[38346]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:31:24 compute-1 sudo[38344]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:24 compute-1 sudo[38467]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mvqzyfcgutogskstwaerrfgxbjosgbhj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013483.6196544-1024-183795242943665/AnsiballZ_copy.py'
Dec 06 09:31:24 compute-1 sudo[38467]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:24 compute-1 python3.9[38469]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765013483.6196544-1024-183795242943665/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:31:24 compute-1 sudo[38467]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:25 compute-1 sudo[38619]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hsfrlugdjiphtdxpvjlanedpvsmxjcfl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013485.0208585-1069-232477403157291/AnsiballZ_systemd.py'
Dec 06 09:31:25 compute-1 sudo[38619]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:26 compute-1 python3.9[38621]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 06 09:31:26 compute-1 systemd[1]: Starting Load Kernel Modules...
Dec 06 09:31:26 compute-1 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Dec 06 09:31:26 compute-1 kernel: Bridge firewalling registered
Dec 06 09:31:26 compute-1 systemd-modules-load[38625]: Inserted module 'br_netfilter'
Dec 06 09:31:26 compute-1 systemd[1]: Finished Load Kernel Modules.
Dec 06 09:31:26 compute-1 sudo[38619]: pam_unix(sudo:session): session closed for user root
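The 99-edpm.conf drop-in deployed at 09:31:24 is what systemd-modules-load picked up on restart; only br_netfilter is visible in the log, so the file's full contents are an assumption. A minimal reproduction:

    printf 'br_netfilter\n' > /etc/modules-load.d/99-edpm.conf   # assumed content
    systemctl restart systemd-modules-load.service
    lsmod | grep br_netfilter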
Dec 06 09:31:26 compute-1 sudo[38778]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efbsdkjxubjrzkbfowdqdmmuuvaumgbi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013486.4231775-1094-53405985451923/AnsiballZ_stat.py'
Dec 06 09:31:26 compute-1 sudo[38778]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:27 compute-1 python3.9[38780]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:31:27 compute-1 sudo[38778]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:27 compute-1 sudo[38901]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-epwfnhbifhzlubloazpqbgfrydisygqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013486.4231775-1094-53405985451923/AnsiballZ_copy.py'
Dec 06 09:31:27 compute-1 sudo[38901]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:27 compute-1 python3.9[38903]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765013486.4231775-1094-53405985451923/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:31:28 compute-1 sudo[38901]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:28 compute-1 sudo[39053]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pnzxbhvtasvqhbhpajoyqdlrgphkxksu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013488.4933007-1147-20658457041141/AnsiballZ_dnf.py'
Dec 06 09:31:28 compute-1 sudo[39053]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:29 compute-1 python3.9[39055]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 06 09:31:34 compute-1 dbus-broker-launch[770]: Noticed file-system modification, trigger reload.
Dec 06 09:31:34 compute-1 dbus-broker-launch[770]: Noticed file-system modification, trigger reload.
Dec 06 09:31:35 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 06 09:31:35 compute-1 systemd[1]: Starting man-db-cache-update.service...
Dec 06 09:31:35 compute-1 systemd[1]: Reloading.
Dec 06 09:31:35 compute-1 systemd-rc-local-generator[39113]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:31:35 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 06 09:31:36 compute-1 sudo[39053]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:37 compute-1 python3.9[40396]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:31:38 compute-1 python3.9[41231]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Dec 06 09:31:38 compute-1 python3.9[41968]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:31:39 compute-1 sudo[43017]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ewmtlbjrkkdbkyixnvnkxdqlzaqbarep ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013499.4740505-1264-150673521528390/AnsiballZ_command.py'
Dec 06 09:31:39 compute-1 sudo[43017]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:39 compute-1 python3.9[43042]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:31:40 compute-1 systemd[1]: Starting Dynamic System Tuning Daemon...
Dec 06 09:31:40 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 06 09:31:40 compute-1 systemd[1]: Finished man-db-cache-update.service.
Dec 06 09:31:40 compute-1 systemd[1]: man-db-cache-update.service: Consumed 5.841s CPU time.
Dec 06 09:31:40 compute-1 systemd[1]: run-r9724e0f38af84b0a8264b851209db721.service: Deactivated successfully.
Dec 06 09:31:40 compute-1 systemd[1]: Starting Authorization Manager...
Dec 06 09:31:40 compute-1 systemd[1]: Started Dynamic System Tuning Daemon.
Dec 06 09:31:40 compute-1 polkitd[43440]: Started polkitd version 0.117
Dec 06 09:31:40 compute-1 polkitd[43440]: Loading rules from directory /etc/polkit-1/rules.d
Dec 06 09:31:40 compute-1 polkitd[43440]: Loading rules from directory /usr/share/polkit-1/rules.d
Dec 06 09:31:40 compute-1 polkitd[43440]: Finished loading, compiling and executing 2 rules
Dec 06 09:31:40 compute-1 systemd[1]: Started Authorization Manager.
Dec 06 09:31:40 compute-1 polkitd[43440]: Acquired the name org.freedesktop.PolicyKit1 on the system bus
Dec 06 09:31:40 compute-1 sudo[43017]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:41 compute-1 sudo[43608]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jhmrkfxgeurfnpwkwrvtfghruqjzvrvq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013501.1952772-1291-220058672541926/AnsiballZ_systemd.py'
Dec 06 09:31:41 compute-1 sudo[43608]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:41 compute-1 python3.9[43610]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:31:41 compute-1 systemd[1]: Stopping Dynamic System Tuning Daemon...
Dec 06 09:31:41 compute-1 systemd[1]: tuned.service: Deactivated successfully.
Dec 06 09:31:41 compute-1 systemd[1]: Stopped Dynamic System Tuning Daemon.
Dec 06 09:31:41 compute-1 systemd[1]: Starting Dynamic System Tuning Daemon...
Dec 06 09:31:42 compute-1 systemd[1]: Started Dynamic System Tuning Daemon.
Dec 06 09:31:42 compute-1 sudo[43608]: pam_unix(sudo:session): session closed for user root
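throughput-performance was applied via tuned-adm at 09:31:39 and the daemon enabled and restarted above; to confirm the active profile on the host:

    tuned-adm active
    cat /etc/tuned/active_profile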
Dec 06 09:31:43 compute-1 python3.9[43772]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Dec 06 09:31:46 compute-1 sudo[43922]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dtmzvbrizhunpjbvxieuvmqkpfpscosn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013506.1934042-1462-225486008014179/AnsiballZ_systemd.py'
Dec 06 09:31:46 compute-1 sudo[43922]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:46 compute-1 python3.9[43924]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:31:46 compute-1 systemd[1]: Reloading.
Dec 06 09:31:47 compute-1 systemd-rc-local-generator[43955]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:31:47 compute-1 sudo[43922]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:47 compute-1 sudo[44112]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dryupdknnluznomekgbdwtqgcycvcmyi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013507.3211064-1462-45342808953697/AnsiballZ_systemd.py'
Dec 06 09:31:47 compute-1 sudo[44112]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:47 compute-1 python3.9[44114]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:31:47 compute-1 systemd[1]: Reloading.
Dec 06 09:31:48 compute-1 systemd-rc-local-generator[44139]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:31:48 compute-1 sudo[44112]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:48 compute-1 sudo[44302]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xhaoctstcfhfvtyuzqdwggsmplwxgnow ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013508.6463418-1510-266458103212242/AnsiballZ_command.py'
Dec 06 09:31:48 compute-1 sudo[44302]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:49 compute-1 python3.9[44304]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:31:49 compute-1 sudo[44302]: pam_unix(sudo:session): session closed for user root
Dec 06 09:31:49 compute-1 sudo[44455]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kmwbyyqkrntmbjqgcocecagzfjsmktcc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013509.4714882-1534-94747373650681/AnsiballZ_command.py'
Dec 06 09:31:49 compute-1 sudo[44455]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:49 compute-1 python3.9[44457]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:31:49 compute-1 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Dec 06 09:31:49 compute-1 sudo[44455]: pam_unix(sudo:session): session closed for user root
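Taken together, the swap tasks (dd at 09:30:49, mode 0600 at 09:30:54, the fstab entry at 09:30:56, mkswap and swapon above) amount to the standard swap-file recipe:

    dd if=/dev/zero of=/swap bs=1M count=1024
    chmod 0600 /swap
    grep -q '^/swap ' /etc/fstab || echo '/swap none swap sw 0 0' >> /etc/fstab
    mkswap /swap
    swapon /swap
    swapon --show    # kernel logged: Adding 1048572k swap on /swap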
Dec 06 09:31:50 compute-1 sudo[44608]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-omwrjpwaeqzjctxjpivuvujnrblijnwy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013510.3647888-1559-90306237811130/AnsiballZ_command.py'
Dec 06 09:31:50 compute-1 sudo[44608]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:50 compute-1 python3.9[44610]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:31:52 compute-1 sudo[44608]: pam_unix(sudo:session): session closed for user root
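update-ca-trust regenerates the consolidated trust stores from the anchor deployed at 09:31:05; one hedged way to confirm the bundle was picked up is to watch the certificate count in the extracted store grow:

    update-ca-trust
    grep -c 'BEGIN CERTIFICATE' /etc/pki/tls/certs/ca-bundle.crt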
Dec 06 09:31:52 compute-1 sudo[44770]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bwphsctyokpcizipvxylixkgwkqrlrsw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013512.5962856-1582-117557911973406/AnsiballZ_command.py'
Dec 06 09:31:52 compute-1 sudo[44770]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:53 compute-1 python3.9[44772]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:31:53 compute-1 sudo[44770]: pam_unix(sudo:session): session closed for user root
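Writing 2 to /sys/kernel/mm/ksm/run unmerges all previously shared pages and leaves KSM stopped (0 = stop, 1 = run, 2 = unmerge and stop), complementing the ksm/ksmtuned service disablement at 09:31:46. Note the logged task passes the redirection with _uses_shell=False, so the '>' reaches echo as a literal argument and no sysfs write occurs; a form that actually performs it:

    echo 2 > /sys/kernel/mm/ksm/run
    cat /sys/kernel/mm/ksm/run   # read back the current mode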
Dec 06 09:31:53 compute-1 sudo[44923]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vfflmcsxaaxjawykguyuoxzynymfgett ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013513.4273527-1606-87057847556672/AnsiballZ_systemd.py'
Dec 06 09:31:53 compute-1 sudo[44923]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:31:54 compute-1 python3.9[44925]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 06 09:31:54 compute-1 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Dec 06 09:31:54 compute-1 systemd[1]: Stopped Apply Kernel Variables.
Dec 06 09:31:54 compute-1 systemd[1]: Stopping Apply Kernel Variables...
Dec 06 09:31:54 compute-1 systemd[1]: Starting Apply Kernel Variables...
Dec 06 09:31:54 compute-1 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Dec 06 09:31:54 compute-1 systemd[1]: Finished Apply Kernel Variables.
Dec 06 09:31:54 compute-1 sudo[44923]: pam_unix(sudo:session): session closed for user root
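Restarting systemd-sysctl reapplies every sysctl.d snippet, including the freshly written 99-edpm.conf (contents not logged). The equivalent one-liner:

    sysctl --system    # reloads /etc/sysctl.d/*.conf and friends in order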
Dec 06 09:31:54 compute-1 sshd-session[31338]: Connection closed by 192.168.122.30 port 33152
Dec 06 09:31:54 compute-1 sshd-session[31335]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:31:54 compute-1 systemd[1]: session-10.scope: Deactivated successfully.
Dec 06 09:31:54 compute-1 systemd[1]: session-10.scope: Consumed 2min 29.219s CPU time.
Dec 06 09:31:54 compute-1 systemd-logind[788]: Session 10 logged out. Waiting for processes to exit.
Dec 06 09:31:54 compute-1 systemd-logind[788]: Removed session 10.
Dec 06 09:32:00 compute-1 sshd-session[44955]: Accepted publickey for zuul from 192.168.122.30 port 34486 ssh2: ECDSA SHA256:r1j7aLsKAM+XxDNbzEU5vWGpGNCOaIBwc7FZdATPttA
Dec 06 09:32:00 compute-1 systemd-logind[788]: New session 11 of user zuul.
Dec 06 09:32:00 compute-1 systemd[1]: Started Session 11 of User zuul.
Dec 06 09:32:00 compute-1 sshd-session[44955]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 06 09:32:01 compute-1 python3.9[45108]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:32:02 compute-1 sudo[45262]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oxlvlvpcgbzthhzsrvenlmxsawzfkiis ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013522.159675-69-101488321768305/AnsiballZ_getent.py'
Dec 06 09:32:02 compute-1 sudo[45262]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:02 compute-1 python3.9[45264]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Dec 06 09:32:02 compute-1 sudo[45262]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:03 compute-1 sudo[45415]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rexflxdagpllyqiogfbqqinfnqfugpqb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013523.1870105-93-10203500119480/AnsiballZ_group.py'
Dec 06 09:32:03 compute-1 sudo[45415]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:03 compute-1 python3.9[45417]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 06 09:32:03 compute-1 groupadd[45418]: group added to /etc/group: name=openvswitch, GID=42476
Dec 06 09:32:03 compute-1 groupadd[45418]: group added to /etc/gshadow: name=openvswitch
Dec 06 09:32:03 compute-1 groupadd[45418]: new group: name=openvswitch, GID=42476
Dec 06 09:32:03 compute-1 sudo[45415]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:04 compute-1 sudo[45573]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wnepuuuzknmvjfszflwbfcsebshiwhcx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013524.3278244-117-106876929172018/AnsiballZ_user.py'
Dec 06 09:32:04 compute-1 sudo[45573]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:05 compute-1 python3.9[45575]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec 06 09:32:05 compute-1 useradd[45577]: new user: name=openvswitch, UID=42476, GID=42476, home=/home/openvswitch, shell=/sbin/nologin, from=/dev/pts/0
Dec 06 09:32:05 compute-1 useradd[45577]: add 'openvswitch' to group 'hugetlbfs'
Dec 06 09:32:05 compute-1 useradd[45577]: add 'openvswitch' to shadow group 'hugetlbfs'
Dec 06 09:32:05 compute-1 sudo[45573]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:05 compute-1 sudo[45733]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wwfaiepldbraqgshgdjoepscprhnkwis ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013525.6245387-147-46536011210672/AnsiballZ_setup.py'
Dec 06 09:32:05 compute-1 sudo[45733]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:06 compute-1 python3.9[45735]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 06 09:32:06 compute-1 sudo[45733]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:06 compute-1 sudo[45817]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lenfdynvgeophvtpiwcirxttbxblupvf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013525.6245387-147-46536011210672/AnsiballZ_dnf.py'
Dec 06 09:32:06 compute-1 sudo[45817]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:07 compute-1 python3.9[45819]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 06 09:32:13 compute-1 sudo[45817]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:13 compute-1 sudo[45981]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vkrydvgxjgrmaeuugxbeqpqllbjtmnuj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013533.5821617-189-263369375446597/AnsiballZ_dnf.py'
Dec 06 09:32:13 compute-1 sudo[45981]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:14 compute-1 python3.9[45983]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 06 09:32:28 compute-1 kernel: SELinux:  Converting 2731 SID table entries...
Dec 06 09:32:28 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Dec 06 09:32:28 compute-1 kernel: SELinux:  policy capability open_perms=1
Dec 06 09:32:28 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Dec 06 09:32:28 compute-1 kernel: SELinux:  policy capability always_check_network=0
Dec 06 09:32:28 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 06 09:32:28 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 06 09:32:28 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 06 09:32:29 compute-1 groupadd[46006]: group added to /etc/group: name=unbound, GID=993
Dec 06 09:32:29 compute-1 groupadd[46006]: group added to /etc/gshadow: name=unbound
Dec 06 09:32:29 compute-1 groupadd[46006]: new group: name=unbound, GID=993
Dec 06 09:32:29 compute-1 useradd[46013]: new user: name=unbound, UID=993, GID=993, home=/var/lib/unbound, shell=/sbin/nologin, from=none
Dec 06 09:32:29 compute-1 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Dec 06 09:32:29 compute-1 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Dec 06 09:32:30 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 06 09:32:30 compute-1 systemd[1]: Starting man-db-cache-update.service...
Dec 06 09:32:30 compute-1 systemd[1]: Reloading.
Dec 06 09:32:30 compute-1 systemd-sysv-generator[46517]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:32:30 compute-1 systemd-rc-local-generator[46513]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:32:31 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 06 09:32:31 compute-1 sudo[45981]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:31 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 06 09:32:31 compute-1 systemd[1]: Finished man-db-cache-update.service.
Dec 06 09:32:31 compute-1 systemd[1]: man-db-cache-update.service: Consumed 1.018s CPU time.
Dec 06 09:32:31 compute-1 systemd[1]: run-rc8ed9a019b1e4f9f982ba30e3b926f38.service: Deactivated successfully.
Dec 06 09:32:32 compute-1 sudo[47080]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-toiqcpysxnmczlhypeiywjcfxyuahuts ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013551.9578166-213-133949792243476/AnsiballZ_systemd.py'
Dec 06 09:32:32 compute-1 sudo[47080]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:32 compute-1 python3.9[47082]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 06 09:32:32 compute-1 systemd[1]: Reloading.
Dec 06 09:32:33 compute-1 systemd-rc-local-generator[47106]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:32:33 compute-1 systemd-sysv-generator[47109]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:32:33 compute-1 systemd[1]: Starting Open vSwitch Database Unit...
Dec 06 09:32:33 compute-1 chown[47124]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Dec 06 09:32:33 compute-1 ovs-ctl[47129]: /etc/openvswitch/conf.db does not exist ... (warning).
Dec 06 09:32:33 compute-1 ovs-ctl[47129]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Dec 06 09:32:33 compute-1 ovs-ctl[47129]: Starting ovsdb-server [  OK  ]
Dec 06 09:32:33 compute-1 ovs-vsctl[47178]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Dec 06 09:32:33 compute-1 ovs-vsctl[47198]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"61eba479-a995-4b31-88b9-8ebfcea9907e\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Dec 06 09:32:33 compute-1 ovs-ctl[47129]: Configuring Open vSwitch system IDs [  OK  ]
Dec 06 09:32:33 compute-1 ovs-ctl[47129]: Enabling remote OVSDB managers [  OK  ]
Dec 06 09:32:33 compute-1 ovs-vsctl[47204]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-1
Dec 06 09:32:33 compute-1 systemd[1]: Started Open vSwitch Database Unit.
Dec 06 09:32:33 compute-1 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Dec 06 09:32:33 compute-1 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Dec 06 09:32:33 compute-1 systemd[1]: Starting Open vSwitch Forwarding Unit...
Dec 06 09:32:33 compute-1 kernel: openvswitch: Open vSwitch switching datapath
Dec 06 09:32:33 compute-1 ovs-ctl[47248]: Inserting openvswitch module [  OK  ]
Dec 06 09:32:34 compute-1 ovs-ctl[47217]: Starting ovs-vswitchd [  OK  ]
Dec 06 09:32:34 compute-1 ovs-vsctl[47265]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-1
Dec 06 09:32:34 compute-1 ovs-ctl[47217]: Enabling remote OVSDB managers [  OK  ]
Dec 06 09:32:34 compute-1 systemd[1]: Started Open vSwitch Forwarding Unit.
Dec 06 09:32:34 compute-1 systemd[1]: Starting Open vSwitch...
Dec 06 09:32:34 compute-1 systemd[1]: Finished Open vSwitch.
Dec 06 09:32:34 compute-1 sudo[47080]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:36 compute-1 python3.9[47417]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:32:36 compute-1 sudo[47567]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wuwrdyrantucqlhjuapwvawmkqducizb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013556.429569-267-244620140856668/AnsiballZ_sefcontext.py'
Dec 06 09:32:36 compute-1 sudo[47567]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:37 compute-1 python3.9[47569]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Dec 06 09:32:38 compute-1 kernel: SELinux:  Converting 2745 SID table entries...
Dec 06 09:32:38 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Dec 06 09:32:38 compute-1 kernel: SELinux:  policy capability open_perms=1
Dec 06 09:32:38 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Dec 06 09:32:38 compute-1 kernel: SELinux:  policy capability always_check_network=0
Dec 06 09:32:38 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 06 09:32:38 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 06 09:32:38 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 06 09:32:38 compute-1 sudo[47567]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:39 compute-1 python3.9[47724]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:32:40 compute-1 sudo[47880]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lvssgjviwuocdtaeamsvkmnqacgebmlh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013560.3804138-321-225955575926336/AnsiballZ_dnf.py'
Dec 06 09:32:40 compute-1 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Dec 06 09:32:40 compute-1 sudo[47880]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:40 compute-1 python3.9[47882]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 06 09:32:42 compute-1 sudo[47880]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:43 compute-1 sudo[48033]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwmrrvzoyehlpcumcsedzitgkwzvhtbs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013562.6684704-345-192351274525175/AnsiballZ_command.py'
Dec 06 09:32:43 compute-1 sudo[48033]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:43 compute-1 python3.9[48035]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:32:44 compute-1 sudo[48033]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:45 compute-1 sudo[48320]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixhfnbfztokvjqgktnhkslxaepugwhfp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013564.5912933-369-100694287457161/AnsiballZ_file.py'
Dec 06 09:32:45 compute-1 sudo[48320]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:45 compute-1 python3.9[48322]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec 06 09:32:45 compute-1 sudo[48320]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:46 compute-1 python3.9[48472]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:32:47 compute-1 sudo[48624]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gxfdofyekbvyfhhzsfjbhnkrpffyfdoi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013566.7831385-417-272737992203002/AnsiballZ_dnf.py'
Dec 06 09:32:47 compute-1 sudo[48624]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:47 compute-1 python3.9[48626]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 06 09:32:49 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 06 09:32:49 compute-1 systemd[1]: Starting man-db-cache-update.service...
Dec 06 09:32:49 compute-1 systemd[1]: Reloading.
Dec 06 09:32:49 compute-1 systemd-rc-local-generator[48658]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:32:49 compute-1 systemd-sysv-generator[48663]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:32:50 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 06 09:32:50 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 06 09:32:50 compute-1 systemd[1]: Finished man-db-cache-update.service.
Dec 06 09:32:50 compute-1 systemd[1]: run-r78981d88ca3449c6b061ca9f5496fc00.service: Deactivated successfully.
Dec 06 09:32:50 compute-1 sudo[48624]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:51 compute-1 sudo[48941]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sayvizhnzltlapphccdtzizlallktolp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013570.9202757-441-216874010938393/AnsiballZ_systemd.py'
Dec 06 09:32:51 compute-1 sudo[48941]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:51 compute-1 python3.9[48943]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 06 09:32:51 compute-1 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Dec 06 09:32:51 compute-1 systemd[1]: Stopped Network Manager Wait Online.
Dec 06 09:32:51 compute-1 systemd[1]: Stopping Network Manager Wait Online...
Dec 06 09:32:51 compute-1 systemd[1]: Stopping Network Manager...
Dec 06 09:32:51 compute-1 NetworkManager[7209]: <info>  [1765013571.6832] caught SIGTERM, shutting down normally.
Dec 06 09:32:51 compute-1 NetworkManager[7209]: <info>  [1765013571.6858] dhcp4 (eth0): canceled DHCP transaction
Dec 06 09:32:51 compute-1 NetworkManager[7209]: <info>  [1765013571.6859] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 06 09:32:51 compute-1 NetworkManager[7209]: <info>  [1765013571.6859] dhcp4 (eth0): state changed no lease
Dec 06 09:32:51 compute-1 NetworkManager[7209]: <info>  [1765013571.6861] manager: NetworkManager state is now CONNECTED_SITE
Dec 06 09:32:51 compute-1 NetworkManager[7209]: <info>  [1765013571.6969] exiting (success)
Dec 06 09:32:51 compute-1 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 06 09:32:51 compute-1 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 06 09:32:51 compute-1 systemd[1]: NetworkManager.service: Deactivated successfully.
Dec 06 09:32:51 compute-1 systemd[1]: Stopped Network Manager.
Dec 06 09:32:51 compute-1 systemd[1]: NetworkManager.service: Consumed 11.988s CPU time, 4.2M memory peak, read 0B from disk, written 27.0K to disk.
Dec 06 09:32:51 compute-1 systemd[1]: Starting Network Manager...
Dec 06 09:32:51 compute-1 NetworkManager[48956]: <info>  [1765013571.7768] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:27715b31-3399-4bbf-a0fa-54836c80918e)
Dec 06 09:32:51 compute-1 NetworkManager[48956]: <info>  [1765013571.7770] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Dec 06 09:32:51 compute-1 NetworkManager[48956]: <info>  [1765013571.7822] manager[0x557774aa6090]: monitoring kernel firmware directory '/lib/firmware'.
Dec 06 09:32:51 compute-1 systemd[1]: Starting Hostname Service...
Dec 06 09:32:51 compute-1 systemd[1]: Started Hostname Service.
Dec 06 09:32:51 compute-1 NetworkManager[48956]: <info>  [1765013571.8877] hostname: hostname: using hostnamed
Dec 06 09:32:51 compute-1 NetworkManager[48956]: <info>  [1765013571.8878] hostname: static hostname changed from (none) to "compute-1"
Dec 06 09:32:51 compute-1 NetworkManager[48956]: <info>  [1765013571.8884] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Dec 06 09:32:51 compute-1 NetworkManager[48956]: <info>  [1765013571.8890] manager[0x557774aa6090]: rfkill: Wi-Fi hardware radio set enabled
Dec 06 09:32:51 compute-1 NetworkManager[48956]: <info>  [1765013571.8890] manager[0x557774aa6090]: rfkill: WWAN hardware radio set enabled
Dec 06 09:32:51 compute-1 NetworkManager[48956]: <info>  [1765013571.8915] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-ovs.so)
Dec 06 09:32:51 compute-1 NetworkManager[48956]: <info>  [1765013571.8924] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Dec 06 09:32:51 compute-1 NetworkManager[48956]: <info>  [1765013571.8925] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Dec 06 09:32:51 compute-1 NetworkManager[48956]: <info>  [1765013571.8925] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Dec 06 09:32:51 compute-1 NetworkManager[48956]: <info>  [1765013571.8926] manager: Networking is enabled by state file
Dec 06 09:32:51 compute-1 NetworkManager[48956]: <info>  [1765013571.8929] settings: Loaded settings plugin: keyfile (internal)
Dec 06 09:32:51 compute-1 NetworkManager[48956]: <info>  [1765013571.8933] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Dec 06 09:32:51 compute-1 NetworkManager[48956]: <info>  [1765013571.8985] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Dec 06 09:32:51 compute-1 NetworkManager[48956]: <info>  [1765013571.9002] dhcp: init: Using DHCP client 'internal'
Dec 06 09:32:51 compute-1 NetworkManager[48956]: <info>  [1765013571.9006] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Dec 06 09:32:51 compute-1 NetworkManager[48956]: <info>  [1765013571.9015] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 06 09:32:51 compute-1 NetworkManager[48956]: <info>  [1765013571.9024] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Dec 06 09:32:51 compute-1 NetworkManager[48956]: <info>  [1765013571.9035] device (lo): Activation: starting connection 'lo' (04d45710-56f6-4696-9924-dd30b84bf74f)
Dec 06 09:32:51 compute-1 NetworkManager[48956]: <info>  [1765013571.9045] device (eth0): carrier: link connected
Dec 06 09:32:51 compute-1 NetworkManager[48956]: <info>  [1765013571.9050] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Dec 06 09:32:51 compute-1 NetworkManager[48956]: <info>  [1765013571.9059] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Dec 06 09:32:51 compute-1 NetworkManager[48956]: <info>  [1765013571.9060] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec 06 09:32:51 compute-1 NetworkManager[48956]: <info>  [1765013571.9069] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec 06 09:32:51 compute-1 NetworkManager[48956]: <info>  [1765013571.9080] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec 06 09:32:51 compute-1 NetworkManager[48956]: <info>  [1765013571.9088] device (eth1): carrier: link connected
Dec 06 09:32:51 compute-1 NetworkManager[48956]: <info>  [1765013571.9092] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Dec 06 09:32:51 compute-1 NetworkManager[48956]: <info>  [1765013571.9101] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (f3fb407f-d9e1-5507-a7f7-856240ad9666) (indicated)
Dec 06 09:32:51 compute-1 NetworkManager[48956]: <info>  [1765013571.9102] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec 06 09:32:51 compute-1 NetworkManager[48956]: <info>  [1765013571.9109] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec 06 09:32:51 compute-1 NetworkManager[48956]: <info>  [1765013571.9117] device (eth1): Activation: starting connection 'ci-private-network' (f3fb407f-d9e1-5507-a7f7-856240ad9666)
Dec 06 09:32:51 compute-1 systemd[1]: Started Network Manager.
Dec 06 09:32:51 compute-1 NetworkManager[48956]: <info>  [1765013571.9131] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Dec 06 09:32:51 compute-1 NetworkManager[48956]: <info>  [1765013571.9145] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Dec 06 09:32:51 compute-1 NetworkManager[48956]: <info>  [1765013571.9148] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Dec 06 09:32:51 compute-1 NetworkManager[48956]: <info>  [1765013571.9150] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Dec 06 09:32:51 compute-1 NetworkManager[48956]: <info>  [1765013571.9152] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec 06 09:32:51 compute-1 NetworkManager[48956]: <info>  [1765013571.9155] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec 06 09:32:51 compute-1 NetworkManager[48956]: <info>  [1765013571.9158] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec 06 09:32:51 compute-1 NetworkManager[48956]: <info>  [1765013571.9161] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec 06 09:32:51 compute-1 NetworkManager[48956]: <info>  [1765013571.9166] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Dec 06 09:32:51 compute-1 NetworkManager[48956]: <info>  [1765013571.9173] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec 06 09:32:51 compute-1 NetworkManager[48956]: <info>  [1765013571.9176] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 06 09:32:51 compute-1 NetworkManager[48956]: <info>  [1765013571.9185] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec 06 09:32:51 compute-1 NetworkManager[48956]: <info>  [1765013571.9197] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec 06 09:32:51 compute-1 NetworkManager[48956]: <info>  [1765013571.9216] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Dec 06 09:32:51 compute-1 NetworkManager[48956]: <info>  [1765013571.9218] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Dec 06 09:32:51 compute-1 NetworkManager[48956]: <info>  [1765013571.9224] device (lo): Activation: successful, device activated.
Dec 06 09:32:51 compute-1 NetworkManager[48956]: <info>  [1765013571.9233] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec 06 09:32:51 compute-1 NetworkManager[48956]: <info>  [1765013571.9235] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec 06 09:32:51 compute-1 NetworkManager[48956]: <info>  [1765013571.9238] manager: NetworkManager state is now CONNECTED_LOCAL
Dec 06 09:32:51 compute-1 NetworkManager[48956]: <info>  [1765013571.9242] device (eth1): Activation: successful, device activated.
Dec 06 09:32:51 compute-1 systemd[1]: Starting Network Manager Wait Online...
Dec 06 09:32:51 compute-1 NetworkManager[48956]: <info>  [1765013571.9448] dhcp4 (eth0): state changed new lease, address=38.102.83.113
Dec 06 09:32:51 compute-1 NetworkManager[48956]: <info>  [1765013571.9457] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Dec 06 09:32:51 compute-1 NetworkManager[48956]: <info>  [1765013571.9536] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec 06 09:32:51 compute-1 NetworkManager[48956]: <info>  [1765013571.9559] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec 06 09:32:51 compute-1 NetworkManager[48956]: <info>  [1765013571.9561] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec 06 09:32:51 compute-1 sudo[48941]: pam_unix(sudo:session): session closed for user root
Dec 06 09:32:51 compute-1 NetworkManager[48956]: <info>  [1765013571.9565] manager: NetworkManager state is now CONNECTED_SITE
Dec 06 09:32:51 compute-1 NetworkManager[48956]: <info>  [1765013571.9569] device (eth0): Activation: successful, device activated.
Dec 06 09:32:51 compute-1 NetworkManager[48956]: <info>  [1765013571.9574] manager: NetworkManager state is now CONNECTED_GLOBAL
Dec 06 09:32:51 compute-1 NetworkManager[48956]: <info>  [1765013571.9577] manager: startup complete
Dec 06 09:32:51 compute-1 systemd[1]: Finished Network Manager Wait Online.
Dec 06 09:32:52 compute-1 sudo[49167]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zcdrxegcfvybvqdrmgkthbkivttvvbvo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013572.181759-465-4349225263027/AnsiballZ_dnf.py'
Dec 06 09:32:52 compute-1 sudo[49167]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:32:52 compute-1 python3.9[49169]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 06 09:32:58 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 06 09:32:58 compute-1 systemd[1]: Starting man-db-cache-update.service...
Dec 06 09:32:58 compute-1 systemd[1]: Reloading.
Dec 06 09:32:58 compute-1 systemd-rc-local-generator[49216]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:32:58 compute-1 systemd-sysv-generator[49220]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:32:58 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 06 09:33:00 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 06 09:33:00 compute-1 systemd[1]: Finished man-db-cache-update.service.
Dec 06 09:33:00 compute-1 systemd[1]: run-rcccf55638ed2426f8a5e5075e64f5b0b.service: Deactivated successfully.
Dec 06 09:33:00 compute-1 sudo[49167]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:01 compute-1 sudo[49627]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pjdmmsgjgnwtyxqthtukqinggznceqyy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013580.9103656-501-85070628322322/AnsiballZ_stat.py'
Dec 06 09:33:01 compute-1 sudo[49627]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:33:01 compute-1 python3.9[49629]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:33:01 compute-1 sudo[49627]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:02 compute-1 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 06 09:33:02 compute-1 sudo[49780]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bpqdbyxicbbosmrikmhzapznphhjqhmk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013581.7556963-528-254675281018706/AnsiballZ_ini_file.py'
Dec 06 09:33:02 compute-1 sudo[49780]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:33:02 compute-1 python3.9[49782]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:33:02 compute-1 sudo[49780]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:03 compute-1 sudo[49934]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jtkbpgaaddpegxeuvwexelyexcckfhnq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013582.836826-558-182431473645315/AnsiballZ_ini_file.py'
Dec 06 09:33:03 compute-1 sudo[49934]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:33:03 compute-1 python3.9[49936]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:33:03 compute-1 sudo[49934]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:03 compute-1 sudo[50086]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-irlphvkidlnsinniyowvwubypftejmhf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013583.5805342-558-256635399908397/AnsiballZ_ini_file.py'
Dec 06 09:33:03 compute-1 sudo[50086]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:33:04 compute-1 python3.9[50088]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:33:04 compute-1 sudo[50086]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:04 compute-1 sudo[50238]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wkzdrwwnyucfgoqpkmjemwzayfpwcfwz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013584.3711603-603-103062029295082/AnsiballZ_ini_file.py'
Dec 06 09:33:04 compute-1 sudo[50238]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:33:04 compute-1 python3.9[50240]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:33:04 compute-1 sudo[50238]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:05 compute-1 sudo[50390]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hlyfwbdhcbpfcrkaigdkmbexxusonykv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013585.08686-603-197264693543748/AnsiballZ_ini_file.py'
Dec 06 09:33:05 compute-1 sudo[50390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:33:05 compute-1 python3.9[50392]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:33:05 compute-1 sudo[50390]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:06 compute-1 sudo[50542]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jggbxpazlmqpcllkpdsaylciigrdjhvc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013585.8883102-648-4839093789921/AnsiballZ_stat.py'
Dec 06 09:33:06 compute-1 sudo[50542]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:33:06 compute-1 python3.9[50544]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:33:06 compute-1 sudo[50542]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:06 compute-1 sudo[50665]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vkdcpakigrzxewzypwyjcjaajyovxvrd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013585.8883102-648-4839093789921/AnsiballZ_copy.py'
Dec 06 09:33:06 compute-1 sudo[50665]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:33:07 compute-1 python3.9[50667]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013585.8883102-648-4839093789921/.source _original_basename=.edt9k9cb follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:33:07 compute-1 sudo[50665]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:07 compute-1 sudo[50817]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vzwjfqsrkdvnrejjzbwiqpavbtsnsnwg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013587.4462538-693-78954616931284/AnsiballZ_file.py'
Dec 06 09:33:07 compute-1 sudo[50817]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:33:07 compute-1 python3.9[50819]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:33:07 compute-1 sudo[50817]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:08 compute-1 sudo[50969]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-axfhjllnuowbjdtvzfatwmoxxdcecevp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013588.2787905-717-36555756993763/AnsiballZ_edpm_os_net_config_mappings.py'
Dec 06 09:33:08 compute-1 sudo[50969]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:33:08 compute-1 python3.9[50971]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Dec 06 09:33:08 compute-1 sudo[50969]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:09 compute-1 sudo[51121]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vdoxwcovidzugdlgmqtvszzyyqitfzux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013589.2728565-744-83346820012787/AnsiballZ_file.py'
Dec 06 09:33:09 compute-1 sudo[51121]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:33:09 compute-1 python3.9[51123]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:33:09 compute-1 sudo[51121]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:10 compute-1 sudo[51273]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-orkoqktesqebeldbygkeriwiadflpgmc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013590.3018146-774-134576091231133/AnsiballZ_stat.py'
Dec 06 09:33:10 compute-1 sudo[51273]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:33:10 compute-1 sudo[51273]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:11 compute-1 sudo[51396]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yqfywmidsvvbjbeqyxslxqxmtdgbwdyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013590.3018146-774-134576091231133/AnsiballZ_copy.py'
Dec 06 09:33:11 compute-1 sudo[51396]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:33:11 compute-1 sudo[51396]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:12 compute-1 sudo[51548]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ggpzxikllrtnmadfwizsmzsnkzdrvnbf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013591.6367824-819-253032466793391/AnsiballZ_slurp.py'
Dec 06 09:33:12 compute-1 sudo[51548]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:33:12 compute-1 python3.9[51550]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Dec 06 09:33:12 compute-1 sudo[51548]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:13 compute-1 sudo[51723]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qlfbaaytnrpsfewbhpvrrvzfgcgclixr ; ANSIBLE_ASYNC_DIR=\'~/.ansible_async\' /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013592.6580362-846-268955869646410/async_wrapper.py j948182577153 300 /home/zuul/.ansible/tmp/ansible-tmp-1765013592.6580362-846-268955869646410/AnsiballZ_edpm_os_net_config.py _'
Dec 06 09:33:13 compute-1 sudo[51723]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:33:13 compute-1 ansible-async_wrapper.py[51725]: Invoked with j948182577153 300 /home/zuul/.ansible/tmp/ansible-tmp-1765013592.6580362-846-268955869646410/AnsiballZ_edpm_os_net_config.py _
Dec 06 09:33:13 compute-1 ansible-async_wrapper.py[51728]: Starting module and watcher
Dec 06 09:33:13 compute-1 ansible-async_wrapper.py[51728]: Start watching 51729 (300)
Dec 06 09:33:13 compute-1 ansible-async_wrapper.py[51729]: Start module (51729)
Dec 06 09:33:13 compute-1 ansible-async_wrapper.py[51725]: Return async_wrapper task started.
Dec 06 09:33:13 compute-1 sudo[51723]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:13 compute-1 python3.9[51730]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Dec 06 09:33:14 compute-1 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Dec 06 09:33:14 compute-1 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Dec 06 09:33:14 compute-1 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Dec 06 09:33:14 compute-1 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Dec 06 09:33:14 compute-1 kernel: cfg80211: failed to load regulatory.db
Dec 06 09:33:15 compute-1 NetworkManager[48956]: <info>  [1765013595.8688] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51731 uid=0 result="success"
Dec 06 09:33:15 compute-1 NetworkManager[48956]: <info>  [1765013595.8711] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51731 uid=0 result="success"
Dec 06 09:33:15 compute-1 NetworkManager[48956]: <info>  [1765013595.9320] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Dec 06 09:33:15 compute-1 NetworkManager[48956]: <info>  [1765013595.9321] audit: op="connection-add" uuid="d45d4ee4-6865-4bc9-8f68-2364ae6474e8" name="br-ex-br" pid=51731 uid=0 result="success"
Dec 06 09:33:15 compute-1 NetworkManager[48956]: <info>  [1765013595.9343] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Dec 06 09:33:15 compute-1 NetworkManager[48956]: <info>  [1765013595.9344] audit: op="connection-add" uuid="2332240b-855a-4c20-952f-a49148c1f030" name="br-ex-port" pid=51731 uid=0 result="success"
Dec 06 09:33:15 compute-1 NetworkManager[48956]: <info>  [1765013595.9366] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Dec 06 09:33:15 compute-1 NetworkManager[48956]: <info>  [1765013595.9367] audit: op="connection-add" uuid="f72a5f18-8164-4e84-81af-63ac70cda19e" name="eth1-port" pid=51731 uid=0 result="success"
Dec 06 09:33:15 compute-1 NetworkManager[48956]: <info>  [1765013595.9385] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Dec 06 09:33:15 compute-1 NetworkManager[48956]: <info>  [1765013595.9387] audit: op="connection-add" uuid="06560179-8f21-4840-89f1-e305670ae13b" name="vlan20-port" pid=51731 uid=0 result="success"
Dec 06 09:33:15 compute-1 NetworkManager[48956]: <info>  [1765013595.9404] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Dec 06 09:33:15 compute-1 NetworkManager[48956]: <info>  [1765013595.9406] audit: op="connection-add" uuid="e78382ba-8d43-4538-9f10-314df9dad09b" name="vlan21-port" pid=51731 uid=0 result="success"
Dec 06 09:33:15 compute-1 NetworkManager[48956]: <info>  [1765013595.9425] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Dec 06 09:33:15 compute-1 NetworkManager[48956]: <info>  [1765013595.9427] audit: op="connection-add" uuid="cbbe476e-61dc-48d9-a4bc-3925a8944b42" name="vlan22-port" pid=51731 uid=0 result="success"
Dec 06 09:33:15 compute-1 NetworkManager[48956]: <info>  [1765013595.9445] manager: (vlan23): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/10)
Dec 06 09:33:15 compute-1 NetworkManager[48956]: <info>  [1765013595.9446] audit: op="connection-add" uuid="7516e499-11e3-42d1-a6da-28b443cf8217" name="vlan23-port" pid=51731 uid=0 result="success"
Dec 06 09:33:15 compute-1 NetworkManager[48956]: <info>  [1765013595.9474] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="connection.autoconnect-priority,connection.timestamp,ipv4.dhcp-client-id,ipv4.dhcp-timeout,ipv6.method,ipv6.dhcp-timeout,ipv6.addr-gen-mode,802-3-ethernet.mtu" pid=51731 uid=0 result="success"
Dec 06 09:33:15 compute-1 NetworkManager[48956]: <info>  [1765013595.9498] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Dec 06 09:33:15 compute-1 NetworkManager[48956]: <info>  [1765013595.9499] audit: op="connection-add" uuid="2a8ba0ca-93e0-4e84-96a1-eb2bf4feb098" name="br-ex-if" pid=51731 uid=0 result="success"
Dec 06 09:33:15 compute-1 NetworkManager[48956]: <info>  [1765013595.9826] audit: op="connection-update" uuid="f3fb407f-d9e1-5507-a7f7-856240ad9666" name="ci-private-network" args="connection.controller,connection.master,connection.slave-type,connection.port-type,connection.timestamp,ovs-external-ids.data,ovs-interface.type,ipv4.method,ipv4.never-default,ipv4.routes,ipv4.dns,ipv4.routing-rules,ipv4.addresses,ipv6.method,ipv6.routes,ipv6.addr-gen-mode,ipv6.dns,ipv6.routing-rules,ipv6.addresses" pid=51731 uid=0 result="success"
Dec 06 09:33:15 compute-1 NetworkManager[48956]: <info>  [1765013595.9852] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Dec 06 09:33:15 compute-1 NetworkManager[48956]: <info>  [1765013595.9856] audit: op="connection-add" uuid="8c1c8581-b0c3-4a63-9c77-9fbc5756bf30" name="vlan20-if" pid=51731 uid=0 result="success"
Dec 06 09:33:15 compute-1 NetworkManager[48956]: <info>  [1765013595.9879] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Dec 06 09:33:15 compute-1 NetworkManager[48956]: <info>  [1765013595.9881] audit: op="connection-add" uuid="844cc38d-873d-41cd-b14e-0e20f1031e80" name="vlan21-if" pid=51731 uid=0 result="success"
Dec 06 09:33:15 compute-1 NetworkManager[48956]: <info>  [1765013595.9905] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Dec 06 09:33:15 compute-1 NetworkManager[48956]: <info>  [1765013595.9906] audit: op="connection-add" uuid="68b52fa8-2c3f-400a-aa45-d937eefe44a1" name="vlan22-if" pid=51731 uid=0 result="success"
Dec 06 09:33:15 compute-1 NetworkManager[48956]: <info>  [1765013595.9935] manager: (vlan23): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/15)
Dec 06 09:33:15 compute-1 NetworkManager[48956]: <info>  [1765013595.9937] audit: op="connection-add" uuid="4184a6b0-b7b9-414f-8c5e-d7c69e6b028e" name="vlan23-if" pid=51731 uid=0 result="success"
Dec 06 09:33:15 compute-1 NetworkManager[48956]: <info>  [1765013595.9956] audit: op="connection-delete" uuid="d0a7d597-e5ec-3c93-9ea9-45506a05a0f2" name="Wired connection 1" pid=51731 uid=0 result="success"
Dec 06 09:33:15 compute-1 NetworkManager[48956]: <info>  [1765013595.9973] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 06 09:33:15 compute-1 NetworkManager[48956]: <info>  [1765013595.9985] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 06 09:33:15 compute-1 NetworkManager[48956]: <info>  [1765013595.9992] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (d45d4ee4-6865-4bc9-8f68-2364ae6474e8)
Dec 06 09:33:15 compute-1 NetworkManager[48956]: <info>  [1765013595.9993] audit: op="connection-activate" uuid="d45d4ee4-6865-4bc9-8f68-2364ae6474e8" name="br-ex-br" pid=51731 uid=0 result="success"
Dec 06 09:33:15 compute-1 NetworkManager[48956]: <info>  [1765013595.9995] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0004] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0010] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (2332240b-855a-4c20-952f-a49148c1f030)
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0012] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0019] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0025] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (f72a5f18-8164-4e84-81af-63ac70cda19e)
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0026] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0034] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0039] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (06560179-8f21-4840-89f1-e305670ae13b)
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0041] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0048] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0053] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (e78382ba-8d43-4538-9f10-314df9dad09b)
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0055] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0063] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0068] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (cbbe476e-61dc-48d9-a4bc-3925a8944b42)
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0070] device (vlan23)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0079] device (vlan23)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0085] device (vlan23)[Open vSwitch Port]: Activation: starting connection 'vlan23-port' (7516e499-11e3-42d1-a6da-28b443cf8217)
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0085] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0088] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0090] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0100] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0105] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0110] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (2a8ba0ca-93e0-4e84-96a1-eb2bf4feb098)
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0110] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0115] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0117] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0118] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0119] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0134] device (eth1): disconnecting for new activation request.
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0134] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0137] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0139] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0140] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0146] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0152] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0157] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (8c1c8581-b0c3-4a63-9c77-9fbc5756bf30)
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0158] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0161] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0164] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0165] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0169] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0174] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0179] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (844cc38d-873d-41cd-b14e-0e20f1031e80)
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0180] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0183] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0184] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0185] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0189] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0232] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0239] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (68b52fa8-2c3f-400a-aa45-d937eefe44a1)
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0240] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0244] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0246] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0247] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0250] device (vlan23)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0256] device (vlan23)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0263] device (vlan23)[Open vSwitch Interface]: Activation: starting connection 'vlan23-if' (4184a6b0-b7b9-414f-8c5e-d7c69e6b028e)
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0264] device (vlan23)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0267] device (vlan23)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0270] device (vlan23)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0272] device (vlan23)[Open vSwitch Port]: Activation: connection 'vlan23-port' attached as port, continuing activation
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0274] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0297] audit: op="device-reapply" interface="eth0" ifindex=2 args="connection.autoconnect-priority,ipv4.dhcp-client-id,ipv4.dhcp-timeout,ipv6.method,ipv6.addr-gen-mode,802-3-ethernet.mtu" pid=51731 uid=0 result="success"
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0300] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0304] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0305] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0313] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0317] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0321] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0324] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0325] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0330] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0334] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0338] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0339] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0344] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0347] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0349] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0350] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 06 09:33:16 compute-1 kernel: ovs-system: entered promiscuous mode
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0358] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 06 09:33:16 compute-1 systemd-udevd[51735]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 09:33:16 compute-1 kernel: Timeout policy base is empty
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0400] device (vlan23)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0409] device (vlan23)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0414] device (vlan23)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 06 09:33:16 compute-1 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0429] device (vlan23)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0439] dhcp4 (eth0): canceled DHCP transaction
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0440] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0440] dhcp4 (eth0): state changed no lease
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0444] dhcp4 (eth0): activation: beginning transaction (no timeout)
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0464] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0471] audit: op="device-reapply" interface="eth1" ifindex=3 pid=51731 uid=0 result="fail" reason="Device is not activated"
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0486] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0496] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0508] device (eth1): disconnecting for new activation request.
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0509] audit: op="connection-activate" uuid="f3fb407f-d9e1-5507-a7f7-856240ad9666" name="ci-private-network" pid=51731 uid=0 result="success"
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0517] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0525] dhcp4 (eth0): state changed new lease, address=38.102.83.113
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0534] device (vlan23)[Open vSwitch Interface]: Activation: connection 'vlan23-if' attached as port, continuing activation
Dec 06 09:33:16 compute-1 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0612] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0734] device (eth1): Activation: starting connection 'ci-private-network' (f3fb407f-d9e1-5507-a7f7-856240ad9666)
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0743] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51731 uid=0 result="success"
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0744] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0747] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0749] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0753] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0757] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0771] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0782] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0795] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0809] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0823] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0833] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0839] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0847] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0854] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0862] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0869] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0877] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0884] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0887] device (vlan23)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0892] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0902] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0909] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0916] device (vlan23)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 06 09:33:16 compute-1 kernel: br-ex: entered promiscuous mode
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0923] device (vlan23)[Open vSwitch Port]: Activation: successful, device activated.
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0933] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.0940] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 06 09:33:16 compute-1 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Dec 06 09:33:16 compute-1 kernel: vlan22: entered promiscuous mode
Dec 06 09:33:16 compute-1 systemd-udevd[51737]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.1077] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.1084] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.1088] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.1096] device (eth1): Activation: successful, device activated.
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.1123] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.1139] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.1142] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.1146] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.1187] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Dec 06 09:33:16 compute-1 kernel: vlan21: entered promiscuous mode
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.1236] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.1273] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 06 09:33:16 compute-1 kernel: vlan20: entered promiscuous mode
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.1283] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.1292] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.1313] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.1324] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 06 09:33:16 compute-1 kernel: vlan23: entered promiscuous mode
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.1532] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.1539] device (vlan23)[Open vSwitch Interface]: carrier: link connected
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.1541] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.1553] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.1559] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.1589] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.1594] device (vlan23)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.1786] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.1792] device (vlan23)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.1794] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.1803] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.1807] device (vlan23)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 06 09:33:16 compute-1 NetworkManager[48956]: <info>  [1765013596.1811] device (vlan23)[Open vSwitch Interface]: Activation: successful, device activated.
Dec 06 09:33:17 compute-1 sudo[52086]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cuhaiclemntcuxmidktcepqgeuvxecgk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013596.82725-846-54963367194876/AnsiballZ_async_status.py'
Dec 06 09:33:17 compute-1 sudo[52086]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:33:17 compute-1 NetworkManager[48956]: <info>  [1765013597.3406] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51731 uid=0 result="success"
Dec 06 09:33:17 compute-1 python3.9[52088]: ansible-ansible.legacy.async_status Invoked with jid=j948182577153.51725 mode=status _async_dir=/root/.ansible_async
Dec 06 09:33:17 compute-1 sudo[52086]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:17 compute-1 NetworkManager[48956]: <info>  [1765013597.5249] checkpoint[0x557774a7c950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Dec 06 09:33:17 compute-1 NetworkManager[48956]: <info>  [1765013597.5251] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51731 uid=0 result="success"
Dec 06 09:33:17 compute-1 NetworkManager[48956]: <info>  [1765013597.8427] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51731 uid=0 result="success"
Dec 06 09:33:17 compute-1 NetworkManager[48956]: <info>  [1765013597.8445] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51731 uid=0 result="success"
Dec 06 09:33:18 compute-1 NetworkManager[48956]: <info>  [1765013598.1084] audit: op="networking-control" arg="global-dns-configuration" pid=51731 uid=0 result="success"
Dec 06 09:33:18 compute-1 NetworkManager[48956]: <info>  [1765013598.1122] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Dec 06 09:33:18 compute-1 NetworkManager[48956]: <info>  [1765013598.1159] audit: op="networking-control" arg="global-dns-configuration" pid=51731 uid=0 result="success"
Dec 06 09:33:18 compute-1 NetworkManager[48956]: <info>  [1765013598.1200] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51731 uid=0 result="success"
Dec 06 09:33:18 compute-1 NetworkManager[48956]: <info>  [1765013598.2718] checkpoint[0x557774a7ca20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Dec 06 09:33:18 compute-1 NetworkManager[48956]: <info>  [1765013598.2723] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51731 uid=0 result="success"
Dec 06 09:33:18 compute-1 ansible-async_wrapper.py[51729]: Module complete (51729)
Dec 06 09:33:18 compute-1 ansible-async_wrapper.py[51728]: Done in kid B.
Dec 06 09:33:20 compute-1 sudo[52192]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-auopeaujgksbbfahkouncqisstuohhup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013596.82725-846-54963367194876/AnsiballZ_async_status.py'
Dec 06 09:33:20 compute-1 sudo[52192]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:33:21 compute-1 python3.9[52194]: ansible-ansible.legacy.async_status Invoked with jid=j948182577153.51725 mode=status _async_dir=/root/.ansible_async
Dec 06 09:33:21 compute-1 sudo[52192]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:21 compute-1 sudo[52292]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wlbslrtwsjglypnikivedjciesgdbfgy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013596.82725-846-54963367194876/AnsiballZ_async_status.py'
Dec 06 09:33:21 compute-1 sudo[52292]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:33:21 compute-1 python3.9[52294]: ansible-ansible.legacy.async_status Invoked with jid=j948182577153.51725 mode=cleanup _async_dir=/root/.ansible_async
Dec 06 09:33:21 compute-1 sudo[52292]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:21 compute-1 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 06 09:33:22 compute-1 sudo[52446]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ihqtdpzpwgbplpwlielywftetpqvlyde ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013602.005343-927-212777252682982/AnsiballZ_stat.py'
Dec 06 09:33:22 compute-1 sudo[52446]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:33:22 compute-1 python3.9[52448]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:33:22 compute-1 sudo[52446]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:22 compute-1 sudo[52569]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nbqdmfekewxlffpcmobghgbqxlpucgbt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013602.005343-927-212777252682982/AnsiballZ_copy.py'
Dec 06 09:33:22 compute-1 sudo[52569]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:33:23 compute-1 python3.9[52571]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013602.005343-927-212777252682982/.source.returncode _original_basename=.aitz6_rl follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:33:23 compute-1 sudo[52569]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:23 compute-1 sudo[52721]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vlbetvusuurxoicxyopzcslsuczlmpsr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013603.6428676-975-269970283608371/AnsiballZ_stat.py'
Dec 06 09:33:23 compute-1 sudo[52721]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:33:24 compute-1 python3.9[52723]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:33:24 compute-1 sudo[52721]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:24 compute-1 sudo[52845]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hsotqdarsodprhxwinyuhhfnurojysgs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013603.6428676-975-269970283608371/AnsiballZ_copy.py'
Dec 06 09:33:24 compute-1 sudo[52845]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:33:24 compute-1 python3.9[52847]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013603.6428676-975-269970283608371/.source.cfg _original_basename=.v7quqsaw follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:33:24 compute-1 sudo[52845]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:25 compute-1 sudo[52997]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-losfqjyxsahqjndqnxafqahztivavvla ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013605.2175224-1020-106739338638054/AnsiballZ_systemd.py'
Dec 06 09:33:25 compute-1 sudo[52997]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:33:25 compute-1 python3.9[52999]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 06 09:33:25 compute-1 systemd[1]: Reloading Network Manager...
Dec 06 09:33:25 compute-1 NetworkManager[48956]: <info>  [1765013605.9513] audit: op="reload" arg="0" pid=53003 uid=0 result="success"
Dec 06 09:33:25 compute-1 NetworkManager[48956]: <info>  [1765013605.9526] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Dec 06 09:33:26 compute-1 systemd[1]: Reloaded Network Manager.
Dec 06 09:33:26 compute-1 sudo[52997]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:26 compute-1 sshd-session[44958]: Connection closed by 192.168.122.30 port 34486
Dec 06 09:33:26 compute-1 sshd-session[44955]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:33:26 compute-1 systemd[1]: session-11.scope: Deactivated successfully.
Dec 06 09:33:26 compute-1 systemd[1]: session-11.scope: Consumed 59.382s CPU time.
Dec 06 09:33:26 compute-1 systemd-logind[788]: Session 11 logged out. Waiting for processes to exit.
Dec 06 09:33:26 compute-1 systemd-logind[788]: Removed session 11.
Dec 06 09:33:32 compute-1 sshd-session[53034]: Accepted publickey for zuul from 192.168.122.30 port 52270 ssh2: ECDSA SHA256:r1j7aLsKAM+XxDNbzEU5vWGpGNCOaIBwc7FZdATPttA
Dec 06 09:33:32 compute-1 systemd-logind[788]: New session 12 of user zuul.
Dec 06 09:33:32 compute-1 systemd[1]: Started Session 12 of User zuul.
Dec 06 09:33:32 compute-1 sshd-session[53034]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 06 09:33:33 compute-1 python3.9[53187]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:33:34 compute-1 python3.9[53342]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 06 09:33:36 compute-1 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 06 09:33:37 compute-1 python3.9[53537]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:33:37 compute-1 sshd-session[53037]: Connection closed by 192.168.122.30 port 52270
Dec 06 09:33:37 compute-1 sshd-session[53034]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:33:37 compute-1 systemd[1]: session-12.scope: Deactivated successfully.
Dec 06 09:33:37 compute-1 systemd[1]: session-12.scope: Consumed 2.570s CPU time.
Dec 06 09:33:37 compute-1 systemd-logind[788]: Session 12 logged out. Waiting for processes to exit.
Dec 06 09:33:37 compute-1 systemd-logind[788]: Removed session 12.
Dec 06 09:33:43 compute-1 sshd-session[53566]: Accepted publickey for zuul from 192.168.122.30 port 60284 ssh2: ECDSA SHA256:r1j7aLsKAM+XxDNbzEU5vWGpGNCOaIBwc7FZdATPttA
Dec 06 09:33:43 compute-1 systemd-logind[788]: New session 13 of user zuul.
Dec 06 09:33:43 compute-1 systemd[1]: Started Session 13 of User zuul.
Dec 06 09:33:43 compute-1 sshd-session[53566]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 06 09:33:44 compute-1 python3.9[53719]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:33:46 compute-1 python3.9[53873]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:33:47 compute-1 sudo[54028]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-irlqseyeomokwxdplsbpdtphbikmaldf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013626.672045-81-90795769285619/AnsiballZ_setup.py'
Dec 06 09:33:47 compute-1 sudo[54028]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:33:47 compute-1 python3.9[54030]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 06 09:33:47 compute-1 sudo[54028]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:48 compute-1 sudo[54112]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gdtouwluoxkoswmbujwqyslnfmraevbp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013626.672045-81-90795769285619/AnsiballZ_dnf.py'
Dec 06 09:33:48 compute-1 sudo[54112]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:33:48 compute-1 python3.9[54114]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 06 09:33:49 compute-1 sudo[54112]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:50 compute-1 sudo[54265]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxqnfaahxmjhehzbwigpmqtvagmicfrd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013629.9092178-117-71664920161595/AnsiballZ_setup.py'
Dec 06 09:33:50 compute-1 sudo[54265]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:33:50 compute-1 python3.9[54267]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 06 09:33:50 compute-1 sudo[54265]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:51 compute-1 sudo[54460]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mbnlqxjiqmmdcgiqscadmtcxhzlzoeql ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013631.3707283-150-114492863801862/AnsiballZ_file.py'
Dec 06 09:33:51 compute-1 sudo[54460]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:33:52 compute-1 python3.9[54462]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:33:52 compute-1 sudo[54460]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:52 compute-1 sudo[54612]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xelpaapzzhveooqlbhecnlonmcitcbcp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013632.311488-174-160383618231550/AnsiballZ_command.py'
Dec 06 09:33:52 compute-1 sudo[54612]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:33:53 compute-1 python3.9[54614]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:33:53 compute-1 systemd[1]: var-lib-containers-storage-overlay-compat2171939770-merged.mount: Deactivated successfully.
Dec 06 09:33:53 compute-1 podman[54615]: 2025-12-06 09:33:53.190277393 +0000 UTC m=+0.080201695 system refresh
Dec 06 09:33:53 compute-1 sudo[54612]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:53 compute-1 sudo[54776]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zyluwlilajpspmwruwsvktztnpnwdmpz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013633.5129702-198-186887960172822/AnsiballZ_stat.py'
Dec 06 09:33:53 compute-1 sudo[54776]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:33:54 compute-1 python3.9[54778]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:33:54 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 06 09:33:54 compute-1 sudo[54776]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:54 compute-1 sudo[54899]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xdavcyrpgbctblpyxbdcvvpwiaqusrwb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013633.5129702-198-186887960172822/AnsiballZ_copy.py'
Dec 06 09:33:54 compute-1 sudo[54899]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:33:55 compute-1 python3.9[54901]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013633.5129702-198-186887960172822/.source.json follow=False _original_basename=podman_network_config.j2 checksum=9e9c5ca233623e32c18f7aced1026064b2947e96 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:33:55 compute-1 sudo[54899]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:55 compute-1 sudo[55051]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cqeonrwnzzctocoliwamuozfequdonyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013635.2833784-243-215436565585376/AnsiballZ_stat.py'
Dec 06 09:33:55 compute-1 sudo[55051]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:33:55 compute-1 python3.9[55053]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:33:55 compute-1 sudo[55051]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:56 compute-1 sudo[55174]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gsiufbqjdmahtyaxsukxcfwjinowqjjh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013635.2833784-243-215436565585376/AnsiballZ_copy.py'
Dec 06 09:33:56 compute-1 sudo[55174]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:33:56 compute-1 python3.9[55176]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765013635.2833784-243-215436565585376/.source.conf follow=False _original_basename=registries.conf.j2 checksum=804a0d01b832e60d20f779a331306df708c87b02 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:33:56 compute-1 sudo[55174]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:57 compute-1 sudo[55326]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-scudzapkfwlkfsikfjsjscynlitmtqbx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013636.8631172-291-179149145681374/AnsiballZ_ini_file.py'
Dec 06 09:33:57 compute-1 sudo[55326]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:33:57 compute-1 python3.9[55328]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:33:57 compute-1 sudo[55326]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:58 compute-1 sudo[55478]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wbgcmlhnftoerjgyzxdxgazwklaonkrc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013637.680561-291-113312667343262/AnsiballZ_ini_file.py'
Dec 06 09:33:58 compute-1 sudo[55478]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:33:58 compute-1 python3.9[55480]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:33:58 compute-1 sudo[55478]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:58 compute-1 sudo[55630]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iijcgtehuptzqppyfxojezbaglqoohbu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013638.394281-291-250397591757195/AnsiballZ_ini_file.py'
Dec 06 09:33:58 compute-1 sudo[55630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:33:58 compute-1 python3.9[55632]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:33:58 compute-1 sudo[55630]: pam_unix(sudo:session): session closed for user root
Dec 06 09:33:59 compute-1 sudo[55782]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tvpbvykohopwdvwjblxbhepgbascijgl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013639.113375-291-79363427324677/AnsiballZ_ini_file.py'
Dec 06 09:33:59 compute-1 sudo[55782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:33:59 compute-1 python3.9[55784]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:33:59 compute-1 sudo[55782]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:00 compute-1 sudo[55934]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-edkauzvkxeybayknfaenxbwxqhyaemwm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013640.1134026-384-45614598764816/AnsiballZ_dnf.py'
Dec 06 09:34:00 compute-1 sudo[55934]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:00 compute-1 python3.9[55936]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 06 09:34:02 compute-1 sudo[55934]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:03 compute-1 sudo[56087]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uejfptmkwbkykcgavxurpyxqjahzqtho ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013642.9163592-417-91445428678381/AnsiballZ_setup.py'
Dec 06 09:34:03 compute-1 sudo[56087]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:03 compute-1 python3.9[56089]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:34:03 compute-1 sudo[56087]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:04 compute-1 sudo[56241]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-afqwssajoxbmvggphmbrcmpptabjyepk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013643.903144-441-76917673067082/AnsiballZ_stat.py'
Dec 06 09:34:04 compute-1 sudo[56241]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:04 compute-1 python3.9[56243]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:34:04 compute-1 sudo[56241]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:05 compute-1 sudo[56393]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sovmqjkggxwlzksuydwjetkvxonyeuui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013644.7959177-468-123781849571698/AnsiballZ_stat.py'
Dec 06 09:34:05 compute-1 sudo[56393]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:05 compute-1 python3.9[56395]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:34:05 compute-1 sudo[56393]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:05 compute-1 sudo[56545]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tspitmykrapcpoaotwillcgkesiqyjmy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013645.6858914-498-254416703667001/AnsiballZ_command.py'
Dec 06 09:34:05 compute-1 sudo[56545]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:06 compute-1 python3.9[56547]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:34:06 compute-1 sudo[56545]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:07 compute-1 sudo[56698]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hfskqtyutjetqlysuwjyvygizivwjmhx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013646.6057773-528-82076731673896/AnsiballZ_service_facts.py'
Dec 06 09:34:07 compute-1 sudo[56698]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:07 compute-1 python3.9[56700]: ansible-service_facts Invoked
Dec 06 09:34:07 compute-1 network[56717]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 06 09:34:07 compute-1 network[56718]: 'network-scripts' will be removed from distribution in near future.
Dec 06 09:34:07 compute-1 network[56719]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 06 09:34:11 compute-1 sudo[56698]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:13 compute-1 sudo[57002]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjfbvwnhpvfcrsnmzazeiepyhkbokixt ; /bin/bash /home/zuul/.ansible/tmp/ansible-tmp-1765013653.2500298-573-234270169565643/AnsiballZ_timesync_provider.sh /home/zuul/.ansible/tmp/ansible-tmp-1765013653.2500298-573-234270169565643/args'
Dec 06 09:34:13 compute-1 sudo[57002]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:13 compute-1 sudo[57002]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:14 compute-1 sudo[57169]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jbdkpboxotkpwdgcqjcdzksaqtfmubhh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013654.3580534-606-33663779389199/AnsiballZ_dnf.py'
Dec 06 09:34:14 compute-1 sudo[57169]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:14 compute-1 python3.9[57171]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 06 09:34:16 compute-1 sudo[57169]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:17 compute-1 sudo[57322]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-admqhoaylygzoonbmfklshzgxydrhszm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013657.0790157-645-257497263758075/AnsiballZ_package_facts.py'
Dec 06 09:34:17 compute-1 sudo[57322]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:18 compute-1 python3.9[57324]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Dec 06 09:34:18 compute-1 sudo[57322]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:19 compute-1 sudo[57474]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ozhbondknfbwqmjhqwjcfwzipkafwfqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013659.1046736-676-27212550812926/AnsiballZ_stat.py'
Dec 06 09:34:19 compute-1 sudo[57474]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:19 compute-1 python3.9[57476]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:34:19 compute-1 sudo[57474]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:20 compute-1 sudo[57599]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ckhtnoyksvlcbnmlsjcrtbsmjcqnmhni ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013659.1046736-676-27212550812926/AnsiballZ_copy.py'
Dec 06 09:34:20 compute-1 sudo[57599]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:20 compute-1 python3.9[57601]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013659.1046736-676-27212550812926/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:34:20 compute-1 sudo[57599]: pam_unix(sudo:session): session closed for user root
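
The stat-then-copy pair above is how a template task looks on the wire: the controller renders the Jinja2 source locally, stats the remote file, and ships the rendered copy only when checksums differ. Judging from `_original_basename=chrony.conf.j2`, the originating task is likely something like (names assumed):

    - name: Deploy /etc/chrony.conf from template
      ansible.builtin.template:
        src: chrony.conf.j2
        dest: /etc/chrony.conf
        mode: '0644'
        backup: true
      become: true

The same pattern repeats below for /etc/sysconfig/chronyd (chronyd.sysconfig.j2).
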
Dec 06 09:34:21 compute-1 sudo[57753]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kbdxvclnufkodkjvxcxpnsonmstufjxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013660.749061-721-146435706367135/AnsiballZ_stat.py'
Dec 06 09:34:21 compute-1 sudo[57753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:21 compute-1 python3.9[57755]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:34:21 compute-1 sudo[57753]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:21 compute-1 sudo[57878]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ipsabhqpgamsrgrnzaomvasmmjuooxsy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013660.749061-721-146435706367135/AnsiballZ_copy.py'
Dec 06 09:34:21 compute-1 sudo[57878]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:22 compute-1 python3.9[57880]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013660.749061-721-146435706367135/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:34:22 compute-1 sudo[57878]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:23 compute-1 sudo[58032]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ruemmmbezlockvquouazsqauczsvzpea ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013663.1616383-784-98517500865961/AnsiballZ_lineinfile.py'
Dec 06 09:34:23 compute-1 sudo[58032]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:23 compute-1 python3.9[58034]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:34:23 compute-1 sudo[58032]: pam_unix(sudo:session): session closed for user root
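
Setting PEERNTP=no in /etc/sysconfig/network keeps the DHCP client hooks from injecting DHCP-provided NTP servers, so the templated chrony.conf stays authoritative. A sketch of the task matching the logged lineinfile parameters (task name assumed):

    - name: Ignore NTP servers learned from DHCP
      ansible.builtin.lineinfile:
        path: /etc/sysconfig/network
        regexp: '^PEERNTP='
        line: PEERNTP=no
        create: true
        backup: true
        mode: '0644'
      become: true
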
Dec 06 09:34:25 compute-1 sudo[58186]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zahcitodegixxxmqbttdghnkyepovsqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013664.945296-828-231841779846215/AnsiballZ_setup.py'
Dec 06 09:34:25 compute-1 sudo[58186]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:25 compute-1 python3.9[58188]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 06 09:34:25 compute-1 sudo[58186]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:26 compute-1 sudo[58270]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iumwjsrfwtcixyfqoorvlkzjwdsxrkmm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013664.945296-828-231841779846215/AnsiballZ_systemd.py'
Dec 06 09:34:26 compute-1 sudo[58270]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:26 compute-1 python3.9[58272]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:34:26 compute-1 sudo[58270]: pam_unix(sudo:session): session closed for user root
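
The preceding setup call gathers only `ansible_service_mgr` so the systemd module knows which service manager it is talking to; the module then enables and starts chronyd. Sketch (task name assumed):

    - name: Enable and start chronyd
      ansible.builtin.systemd:
        name: chronyd
        enabled: true
        state: started
      become: true
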
Dec 06 09:34:27 compute-1 sudo[58424]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pmpkzdoljkixzryecgymuayrwnimqpqq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013667.6071174-876-270379646152778/AnsiballZ_setup.py'
Dec 06 09:34:27 compute-1 sudo[58424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:28 compute-1 python3.9[58426]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 06 09:34:28 compute-1 sudo[58424]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:28 compute-1 sudo[58508]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vfdibuapmmzlylvzhxjzgawkdthmjzge ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013667.6071174-876-270379646152778/AnsiballZ_systemd.py'
Dec 06 09:34:28 compute-1 sudo[58508]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:29 compute-1 python3.9[58510]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 06 09:34:29 compute-1 chronyd[796]: chronyd exiting
Dec 06 09:34:29 compute-1 systemd[1]: Stopping NTP client/server...
Dec 06 09:34:29 compute-1 systemd[1]: chronyd.service: Deactivated successfully.
Dec 06 09:34:29 compute-1 systemd[1]: Stopped NTP client/server.
Dec 06 09:34:29 compute-1 systemd[1]: Starting NTP client/server...
Dec 06 09:34:29 compute-1 chronyd[58518]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Dec 06 09:34:29 compute-1 chronyd[58518]: Frequency -23.235 +/- 0.721 ppm read from /var/lib/chrony/drift
Dec 06 09:34:29 compute-1 chronyd[58518]: Loaded seccomp filter (level 2)
Dec 06 09:34:29 compute-1 systemd[1]: Started NTP client/server.
Dec 06 09:34:29 compute-1 sudo[58508]: pam_unix(sudo:session): session closed for user root
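
The second setup/systemd pair is the restart firing after the configuration changes, likely a notified handler: the journal shows the old chronyd (PID 796) exiting and instance 58518 starting, re-reading the stored frequency from /var/lib/chrony/drift and loading its seccomp filter. Sketch (handler name assumed):

    - name: Restart chronyd   # handler
      ansible.builtin.systemd:
        name: chronyd
        state: restarted
      become: true
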
Dec 06 09:34:30 compute-1 sshd-session[53569]: Connection closed by 192.168.122.30 port 60284
Dec 06 09:34:30 compute-1 sshd-session[53566]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:34:30 compute-1 systemd[1]: session-13.scope: Deactivated successfully.
Dec 06 09:34:30 compute-1 systemd[1]: session-13.scope: Consumed 28.919s CPU time.
Dec 06 09:34:30 compute-1 systemd-logind[788]: Session 13 logged out. Waiting for processes to exit.
Dec 06 09:34:30 compute-1 systemd-logind[788]: Removed session 13.
Dec 06 09:34:36 compute-1 sshd-session[58544]: Accepted publickey for zuul from 192.168.122.30 port 44486 ssh2: ECDSA SHA256:r1j7aLsKAM+XxDNbzEU5vWGpGNCOaIBwc7FZdATPttA
Dec 06 09:34:36 compute-1 systemd-logind[788]: New session 14 of user zuul.
Dec 06 09:34:36 compute-1 systemd[1]: Started Session 14 of User zuul.
Dec 06 09:34:36 compute-1 sshd-session[58544]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 06 09:34:36 compute-1 sudo[58697]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lvegovkadiequaqyqvwejohljxglylpq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013676.2978988-27-141873964849740/AnsiballZ_file.py'
Dec 06 09:34:36 compute-1 sudo[58697]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:37 compute-1 python3.9[58699]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:34:37 compute-1 sudo[58697]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:37 compute-1 sudo[58849]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-heofcemoxzegnxxozphyrmcqcanzdcfb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013677.2775483-63-73203465115698/AnsiballZ_stat.py'
Dec 06 09:34:37 compute-1 sudo[58849]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:37 compute-1 python3.9[58851]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/ceph-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:34:37 compute-1 sudo[58849]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:38 compute-1 sudo[58972]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-shgqamxxzpuqsxrzieqimmqzigfgrioq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013677.2775483-63-73203465115698/AnsiballZ_copy.py'
Dec 06 09:34:38 compute-1 sudo[58972]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:38 compute-1 python3.9[58974]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/ceph-networks.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013677.2775483-63-73203465115698/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=729ea8396013e3343245d6e934e0dcef55029ad2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:34:38 compute-1 sudo[58972]: pam_unix(sudo:session): session closed for user root
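
Firewall rules are staged as per-service YAML snippets under /var/lib/edpm-config/firewall and only later compiled into nftables rules (see the edpm_nftables_from_files call at 09:35:35). A sketch of the two tasks above (names assumed; template source from `_original_basename`):

    - name: Create the firewall snippet directory
      ansible.builtin.file:
        path: /var/lib/edpm-config/firewall
        state: directory
        owner: root
        group: root
        mode: '0750'
      become: true

    - name: Install the ceph-networks firewall snippet
      ansible.builtin.template:
        src: firewall.yaml.j2
        dest: /var/lib/edpm-config/firewall/ceph-networks.yaml
        mode: '0644'
      become: true
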
Dec 06 09:34:39 compute-1 sshd-session[58547]: Connection closed by 192.168.122.30 port 44486
Dec 06 09:34:39 compute-1 sshd-session[58544]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:34:39 compute-1 systemd[1]: session-14.scope: Deactivated successfully.
Dec 06 09:34:39 compute-1 systemd[1]: session-14.scope: Consumed 1.855s CPU time.
Dec 06 09:34:39 compute-1 systemd-logind[788]: Session 14 logged out. Waiting for processes to exit.
Dec 06 09:34:39 compute-1 systemd-logind[788]: Removed session 14.
Dec 06 09:34:45 compute-1 sshd-session[58999]: Accepted publickey for zuul from 192.168.122.30 port 59064 ssh2: ECDSA SHA256:r1j7aLsKAM+XxDNbzEU5vWGpGNCOaIBwc7FZdATPttA
Dec 06 09:34:45 compute-1 systemd-logind[788]: New session 15 of user zuul.
Dec 06 09:34:45 compute-1 systemd[1]: Started Session 15 of User zuul.
Dec 06 09:34:45 compute-1 sshd-session[58999]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 06 09:34:46 compute-1 python3.9[59152]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:34:48 compute-1 sudo[59306]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lkkkxghmuessxpiycpcfblgyftrwxizt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013687.5474572-60-95618572986046/AnsiballZ_file.py'
Dec 06 09:34:48 compute-1 sudo[59306]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:48 compute-1 python3.9[59308]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:34:48 compute-1 sudo[59306]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:49 compute-1 sudo[59481]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ewizzfumkircbksqrveqhrtgqpzyfjcc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013688.536725-84-53368865078967/AnsiballZ_stat.py'
Dec 06 09:34:49 compute-1 sudo[59481]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:49 compute-1 python3.9[59483]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:34:49 compute-1 sudo[59481]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:49 compute-1 sudo[59604]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ndikccamanpwbbukhbhxhzjiqbwrhmoy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013688.536725-84-53368865078967/AnsiballZ_copy.py'
Dec 06 09:34:49 compute-1 sudo[59604]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:49 compute-1 python3.9[59606]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1765013688.536725-84-53368865078967/.source.json _original_basename=.qnh66egr follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:34:49 compute-1 sudo[59604]: pam_unix(sudo:session): session closed for user root
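
The logged checksum for auth.json (bf21a9e8…) matches the SHA-1 of the literal string "{}", so this step most plausibly seeds an empty container-registry auth file. Sketch, with the content inferred from the checksum rather than the log:

    - name: Seed an empty container registry auth file
      ansible.builtin.copy:
        dest: /root/.config/containers/auth.json
        content: '{}'   # assumption: sha1("{}") matches the logged checksum
        owner: zuul
        group: zuul
        mode: '0660'
      become: true
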
Dec 06 09:34:50 compute-1 sudo[59756]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jubfnabpaaktgfcvtlsyygnhmoriphci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013690.6593707-153-263416561976080/AnsiballZ_stat.py'
Dec 06 09:34:50 compute-1 sudo[59756]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:51 compute-1 python3.9[59758]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:34:51 compute-1 sudo[59756]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:51 compute-1 sudo[59879]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oxoxsdkzskoqswjvqfzufcqmypjraxmc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013690.6593707-153-263416561976080/AnsiballZ_copy.py'
Dec 06 09:34:51 compute-1 sudo[59879]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:51 compute-1 python3.9[59881]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013690.6593707-153-263416561976080/.source _original_basename=.79vzxolj follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:34:51 compute-1 sudo[59879]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:52 compute-1 sudo[60031]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cwroapsfbuzhxtlkpyhtqycrbchwlesj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013692.039393-201-67348302634729/AnsiballZ_file.py'
Dec 06 09:34:52 compute-1 sudo[60031]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:52 compute-1 python3.9[60033]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:34:52 compute-1 sudo[60031]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:53 compute-1 sudo[60183]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pqmruaomkbgnakmwggwwxkxfjdmjprsf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013692.8972878-225-279198583258885/AnsiballZ_stat.py'
Dec 06 09:34:53 compute-1 sudo[60183]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:53 compute-1 python3.9[60185]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:34:53 compute-1 sudo[60183]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:53 compute-1 sudo[60306]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ymrbihsifhlbdngzvlwvldwpkvmubrfa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013692.8972878-225-279198583258885/AnsiballZ_copy.py'
Dec 06 09:34:53 compute-1 sudo[60306]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:54 compute-1 python3.9[60308]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765013692.8972878-225-279198583258885/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:34:54 compute-1 sudo[60306]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:54 compute-1 sudo[60458]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wtzmfkxkcvvszoysthskxjrcsfloiwbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013694.2990668-225-163750403702538/AnsiballZ_stat.py'
Dec 06 09:34:54 compute-1 sudo[60458]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:54 compute-1 python3.9[60460]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:34:54 compute-1 sudo[60458]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:55 compute-1 sudo[60581]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xalrchzrdhtesaqvgijeninfsybmhcdh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013694.2990668-225-163750403702538/AnsiballZ_copy.py'
Dec 06 09:34:55 compute-1 sudo[60581]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:55 compute-1 python3.9[60583]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765013694.2990668-225-163750403702538/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:34:55 compute-1 sudo[60581]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:56 compute-1 sudo[60733]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bjqnnyghuoljedbkslifssllkvbsurki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013696.1013484-312-206457529668990/AnsiballZ_file.py'
Dec 06 09:34:56 compute-1 sudo[60733]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:56 compute-1 python3.9[60735]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:34:56 compute-1 sudo[60733]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:57 compute-1 sudo[60885]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ikedqgsayyvykwqvyovevusrktqngelo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013696.9059095-336-90911090076190/AnsiballZ_stat.py'
Dec 06 09:34:57 compute-1 sudo[60885]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:57 compute-1 python3.9[60887]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:34:57 compute-1 sudo[60885]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:58 compute-1 sudo[61008]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kcxbpbdxrfydkpowpimokyekbboevjuz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013696.9059095-336-90911090076190/AnsiballZ_copy.py'
Dec 06 09:34:58 compute-1 sudo[61008]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:58 compute-1 python3.9[61010]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013696.9059095-336-90911090076190/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:34:58 compute-1 sudo[61008]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:58 compute-1 sudo[61160]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nqznofqeqqtpjcmegndbphjsjxeeryhj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013698.5931203-381-208996469997144/AnsiballZ_stat.py'
Dec 06 09:34:58 compute-1 sudo[61160]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:59 compute-1 python3.9[61162]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:34:59 compute-1 sudo[61160]: pam_unix(sudo:session): session closed for user root
Dec 06 09:34:59 compute-1 sudo[61283]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ykargvyiowgrizgyvvfodpcghlpimyez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013698.5931203-381-208996469997144/AnsiballZ_copy.py'
Dec 06 09:34:59 compute-1 sudo[61283]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:34:59 compute-1 python3.9[61285]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013698.5931203-381-208996469997144/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:34:59 compute-1 sudo[61283]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:00 compute-1 sudo[61435]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-khxhotghjclndkgngtktbnsbluhsevwl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013700.168006-426-92481262098321/AnsiballZ_systemd.py'
Dec 06 09:35:00 compute-1 sudo[61435]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:01 compute-1 python3.9[61437]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:35:01 compute-1 systemd[1]: Reloading.
Dec 06 09:35:01 compute-1 systemd-rc-local-generator[61461]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:35:01 compute-1 systemd-sysv-generator[61469]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:35:01 compute-1 systemd[1]: Reloading.
Dec 06 09:35:01 compute-1 systemd-rc-local-generator[61493]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:35:01 compute-1 systemd-sysv-generator[61502]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:35:01 compute-1 systemd[1]: Starting EDPM Container Shutdown...
Dec 06 09:35:01 compute-1 systemd[1]: Finished EDPM Container Shutdown.
Dec 06 09:35:01 compute-1 sudo[61435]: pam_unix(sudo:session): session closed for user root
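
Because the edpm-container-shutdown unit and its preset were just written, the systemd task passes daemon_reload=True; the two "Reloading." passes are consistent with the explicit daemon-reload plus systemd re-reading unit files when the unit is enabled. Sketch of the task matching the logged parameters (name assumed):

    - name: Enable and start the EDPM container shutdown unit
      ansible.builtin.systemd:
        name: edpm-container-shutdown
        daemon_reload: true
        enabled: true
        state: started
      become: true

The same pattern repeats below for netns-placeholder.
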
Dec 06 09:35:02 compute-1 sudo[61663]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wmwjffijyjqwnbdhzfddlwvvrbhvwnyu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013702.0467942-450-212471631730688/AnsiballZ_stat.py'
Dec 06 09:35:02 compute-1 sudo[61663]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:02 compute-1 python3.9[61665]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:35:02 compute-1 sudo[61663]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:02 compute-1 sudo[61786]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kriqgxklsmbqkpigtjmedfrsxoriumve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013702.0467942-450-212471631730688/AnsiballZ_copy.py'
Dec 06 09:35:02 compute-1 sudo[61786]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:03 compute-1 python3.9[61788]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013702.0467942-450-212471631730688/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:35:03 compute-1 sudo[61786]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:03 compute-1 sudo[61938]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rpzzncmwvblsscuujowlvfkxcnorhcoe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013703.4939406-495-6268607295447/AnsiballZ_stat.py'
Dec 06 09:35:03 compute-1 sudo[61938]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:04 compute-1 python3.9[61940]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:35:04 compute-1 sudo[61938]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:04 compute-1 sudo[62061]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iatvexqpdhamqdouielklcauzzrfjuov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013703.4939406-495-6268607295447/AnsiballZ_copy.py'
Dec 06 09:35:04 compute-1 sudo[62061]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:04 compute-1 python3.9[62063]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013703.4939406-495-6268607295447/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:35:04 compute-1 sudo[62061]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:05 compute-1 sudo[62213]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zqxxyhjesbpargnasvkqhuswwrknsjzr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013705.0030487-540-83611737305699/AnsiballZ_systemd.py'
Dec 06 09:35:05 compute-1 sudo[62213]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:05 compute-1 python3.9[62215]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:35:05 compute-1 systemd[1]: Reloading.
Dec 06 09:35:05 compute-1 systemd-rc-local-generator[62241]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:35:05 compute-1 systemd-sysv-generator[62247]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:35:06 compute-1 systemd[1]: Reloading.
Dec 06 09:35:06 compute-1 systemd-rc-local-generator[62278]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:35:06 compute-1 systemd-sysv-generator[62282]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:35:06 compute-1 systemd[1]: Starting Create netns directory...
Dec 06 09:35:06 compute-1 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 06 09:35:06 compute-1 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 06 09:35:06 compute-1 systemd[1]: Finished Create netns directory.
Dec 06 09:35:06 compute-1 sudo[62213]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:08 compute-1 python3.9[62441]: ansible-ansible.builtin.service_facts Invoked
Dec 06 09:35:09 compute-1 network[62458]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 06 09:35:09 compute-1 network[62459]: 'network-scripts' will be removed from distribution in near future.
Dec 06 09:35:09 compute-1 network[62460]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 06 09:35:13 compute-1 sudo[62720]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bkofnljdamuyznhwtwvelchypmjcardu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013712.8984225-588-258749827809889/AnsiballZ_systemd.py'
Dec 06 09:35:13 compute-1 sudo[62720]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:13 compute-1 python3.9[62722]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:35:13 compute-1 systemd[1]: Reloading.
Dec 06 09:35:13 compute-1 systemd-rc-local-generator[62747]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:35:13 compute-1 systemd-sysv-generator[62754]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:35:13 compute-1 systemd[1]: Stopping IPv4 firewall with iptables...
Dec 06 09:35:14 compute-1 iptables.init[62762]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Dec 06 09:35:14 compute-1 iptables.init[62762]: iptables: Flushing firewall rules: [  OK  ]
Dec 06 09:35:14 compute-1 systemd[1]: iptables.service: Deactivated successfully.
Dec 06 09:35:14 compute-1 systemd[1]: Stopped IPv4 firewall with iptables.
Dec 06 09:35:14 compute-1 sudo[62720]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:14 compute-1 sudo[62957]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sbspxrokgsojfxrgyqaimqmadsbbrwhf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013714.3310268-588-195828582164406/AnsiballZ_systemd.py'
Dec 06 09:35:14 compute-1 sudo[62957]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:14 compute-1 python3.9[62959]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:35:14 compute-1 sudo[62957]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:16 compute-1 sudo[63111]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mkctaxbxbjdzixtycjpwfqymaizkyxyv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013716.6622198-636-192821666881634/AnsiballZ_systemd.py'
Dec 06 09:35:16 compute-1 sudo[63111]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:17 compute-1 python3.9[63113]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:35:17 compute-1 systemd[1]: Reloading.
Dec 06 09:35:17 compute-1 systemd-rc-local-generator[63143]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:35:17 compute-1 systemd-sysv-generator[63147]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:35:17 compute-1 systemd[1]: Starting Netfilter Tables...
Dec 06 09:35:17 compute-1 systemd[1]: Finished Netfilter Tables.
Dec 06 09:35:17 compute-1 sudo[63111]: pam_unix(sudo:session): session closed for user root
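
This block migrates the host firewall from the legacy iptables/ip6tables services to nftables: both legacy services are stopped and disabled (the iptables.init lines show chains reset to ACCEPT and rules flushed), then nftables is enabled and started. Sketch of the three tasks (names assumed):

    - name: Stop and disable iptables
      ansible.builtin.systemd:
        name: iptables.service
        state: stopped
        enabled: false
      become: true

    - name: Stop and disable ip6tables
      ansible.builtin.systemd:
        name: ip6tables.service
        state: stopped
        enabled: false
      become: true

    - name: Enable and start nftables
      ansible.builtin.systemd:
        name: nftables
        enabled: true
        state: started
      become: true
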
Dec 06 09:35:18 compute-1 sudo[63304]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jykgfxhsdzzeqfaysomzkbyhhdkqpfml ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013718.0540724-660-265210627288600/AnsiballZ_command.py'
Dec 06 09:35:18 compute-1 sudo[63304]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:18 compute-1 python3.9[63306]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:35:18 compute-1 sudo[63304]: pam_unix(sudo:session): session closed for user root
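
With nftables now active, the live ruleset is flushed before the generated rule files are loaded; the load (`nft -f /etc/nftables/iptables.nft`) and a JSON read-back (`nft -j list ruleset`) follow at 09:35:33-34 below. Sketch of the command tasks (names assumed):

    - name: Flush the live nftables ruleset
      ansible.builtin.command: nft flush ruleset
      become: true

    - name: Load the iptables-compat base chains
      ansible.builtin.command: nft -f /etc/nftables/iptables.nft
      become: true
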
Dec 06 09:35:19 compute-1 sudo[63457]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vkomqubhjfruomryjjiuealvkktwxkov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013719.4834769-702-267801939867960/AnsiballZ_stat.py'
Dec 06 09:35:19 compute-1 sudo[63457]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:19 compute-1 python3.9[63459]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:35:20 compute-1 sudo[63457]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:20 compute-1 sudo[63582]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djrlwlehirlrlmykopnuezqyutgyqiej ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013719.4834769-702-267801939867960/AnsiballZ_copy.py'
Dec 06 09:35:20 compute-1 sudo[63582]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:20 compute-1 python3.9[63584]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013719.4834769-702-267801939867960/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:35:20 compute-1 sudo[63582]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:21 compute-1 sudo[63735]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ngptwisgwzherhejzgkpalumjumbdffz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013720.9686754-747-207294296046308/AnsiballZ_systemd.py'
Dec 06 09:35:21 compute-1 sudo[63735]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:21 compute-1 python3.9[63737]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 06 09:35:21 compute-1 systemd[1]: Reloading OpenSSH server daemon...
Dec 06 09:35:21 compute-1 sshd[1008]: Received SIGHUP; restarting.
Dec 06 09:35:21 compute-1 systemd[1]: Reloaded OpenSSH server daemon.
Dec 06 09:35:21 compute-1 sshd[1008]: Server listening on 0.0.0.0 port 22.
Dec 06 09:35:21 compute-1 sshd[1008]: Server listening on :: port 22.
Dec 06 09:35:21 compute-1 sudo[63735]: pam_unix(sudo:session): session closed for user root
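
The sshd_config deployment uses the copy module's validate parameter, running `/usr/sbin/sshd -T -f %s` against the staged file so a syntactically broken config can never replace /etc/ssh/sshd_config; the reload then delivers SIGHUP to the running daemon (PID 1008), which re-binds on port 22. A sketch of the pair, reconstructed from `_original_basename=sshd_config_block.j2` (names assumed):

    - name: Deploy sshd_config, refusing invalid configs
      ansible.builtin.template:
        src: sshd_config_block.j2
        dest: /etc/ssh/sshd_config
        mode: '0600'
        validate: '/usr/sbin/sshd -T -f %s'
      become: true

    - name: Reload sshd
      ansible.builtin.systemd:
        name: sshd
        state: reloaded
      become: true
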
Dec 06 09:35:23 compute-1 sudo[63891]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-glmvfbxjwfgrosfodfucmwekgfjuogeh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013723.0577621-771-20391697795638/AnsiballZ_file.py'
Dec 06 09:35:23 compute-1 sudo[63891]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:23 compute-1 python3.9[63893]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:35:23 compute-1 sudo[63891]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:24 compute-1 sudo[64043]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gsrwjkbjsytlfrncilghyulprrwvokxh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013723.897249-795-20485533924408/AnsiballZ_stat.py'
Dec 06 09:35:24 compute-1 sudo[64043]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:24 compute-1 python3.9[64045]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:35:24 compute-1 sudo[64043]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:24 compute-1 sudo[64166]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-szykevvhdlaswzeuhxefrvdiglfhzldd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013723.897249-795-20485533924408/AnsiballZ_copy.py'
Dec 06 09:35:24 compute-1 sudo[64166]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:25 compute-1 python3.9[64168]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013723.897249-795-20485533924408/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:35:25 compute-1 sudo[64166]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:26 compute-1 sudo[64318]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzacehueceskxspbudxgpsyfujkwwdvk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013725.6717422-849-6491526726126/AnsiballZ_timezone.py'
Dec 06 09:35:26 compute-1 sudo[64318]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:27 compute-1 python3.9[64320]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Dec 06 09:35:27 compute-1 systemd[1]: Starting Time & Date Service...
Dec 06 09:35:27 compute-1 systemd[1]: Started Time & Date Service.
Dec 06 09:35:27 compute-1 sudo[64318]: pam_unix(sudo:session): session closed for user root
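
The timezone module talks to systemd-timedated, which is why systemd starts the Time & Date Service on demand here. Sketch (task name assumed):

    - name: Set the host timezone to UTC
      community.general.timezone:
        name: UTC
      become: true
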
Dec 06 09:35:28 compute-1 sudo[64474]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jrptkkvqksqupdresnrfhnjvvygaxise ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013727.8320305-876-48785732525523/AnsiballZ_file.py'
Dec 06 09:35:28 compute-1 sudo[64474]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:28 compute-1 python3.9[64476]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:35:28 compute-1 sudo[64474]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:28 compute-1 sudo[64626]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fbfptsrvxsfnlrkltqlwohxoyppynoco ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013728.6404147-900-79958882178775/AnsiballZ_stat.py'
Dec 06 09:35:28 compute-1 sudo[64626]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:29 compute-1 python3.9[64628]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:35:29 compute-1 sudo[64626]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:29 compute-1 sudo[64749]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lenxxfjhgxkirngfedmmcihulzbxhzag ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013728.6404147-900-79958882178775/AnsiballZ_copy.py'
Dec 06 09:35:29 compute-1 sudo[64749]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:29 compute-1 python3.9[64751]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013728.6404147-900-79958882178775/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:35:29 compute-1 sudo[64749]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:30 compute-1 sudo[64902]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fggbgwqogaudfnlqrreylokxhszfqujd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013730.1672492-945-261750151401702/AnsiballZ_stat.py'
Dec 06 09:35:30 compute-1 sudo[64902]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:30 compute-1 python3.9[64904]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:35:30 compute-1 sudo[64902]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:31 compute-1 sudo[65025]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jgelhtpezsmxmupzyilffnfyyakukzqd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013730.1672492-945-261750151401702/AnsiballZ_copy.py'
Dec 06 09:35:31 compute-1 sudo[65025]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:31 compute-1 python3.9[65027]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765013730.1672492-945-261750151401702/.source.yaml _original_basename=.zarpue_k follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:35:31 compute-1 sudo[65025]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:32 compute-1 sudo[65177]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xwzaiopeoovxjwutjpcuzwycwbumxxfq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013731.7201493-990-277496101914856/AnsiballZ_stat.py'
Dec 06 09:35:32 compute-1 sudo[65177]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:32 compute-1 python3.9[65179]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:35:32 compute-1 sudo[65177]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:32 compute-1 sudo[65300]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-noxlnqxfqticxuoryjujroshzwpnqwri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013731.7201493-990-277496101914856/AnsiballZ_copy.py'
Dec 06 09:35:32 compute-1 sudo[65300]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:32 compute-1 python3.9[65302]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013731.7201493-990-277496101914856/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:35:32 compute-1 sudo[65300]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:33 compute-1 sudo[65452]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ovypncwjfwssuuqhqsepkjlpgdqimczg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013733.1703951-1035-280082352190687/AnsiballZ_command.py'
Dec 06 09:35:33 compute-1 sudo[65452]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:33 compute-1 python3.9[65454]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:35:33 compute-1 sudo[65452]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:34 compute-1 sudo[65605]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vjzwdfxzcouexansbsacrusfsawavqut ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013733.952213-1059-85487381563007/AnsiballZ_command.py'
Dec 06 09:35:34 compute-1 sudo[65605]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:34 compute-1 python3.9[65607]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:35:34 compute-1 sudo[65605]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:35 compute-1 sudo[65758]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nesnjrcqfpqmbfcnadjzaxknjspfcmau ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765013734.74876-1083-259629684301246/AnsiballZ_edpm_nftables_from_files.py'
Dec 06 09:35:35 compute-1 sudo[65758]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:35 compute-1 python3[65760]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec 06 09:35:35 compute-1 sudo[65758]: pam_unix(sudo:session): session closed for user root
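
edpm_nftables_from_files is a custom module (its collection prefix is not shown in the log, and note it runs under /usr/bin/python3 rather than python3.9); it aggregates the YAML snippets staged earlier under /var/lib/edpm-config/firewall into rule definitions. Sketch grounded in the single logged parameter:

    - name: Aggregate firewall snippets into nftables rules
      edpm_nftables_from_files:   # custom module; collection prefix assumed
        src: /var/lib/edpm-config/firewall
      become: true
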
Dec 06 09:35:36 compute-1 sudo[65910]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ptuqgrajnkhvcpmlomnotreeqzjmynvi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013735.7590027-1107-109234224364961/AnsiballZ_stat.py'
Dec 06 09:35:36 compute-1 sudo[65910]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:36 compute-1 python3.9[65912]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:35:36 compute-1 sudo[65910]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:36 compute-1 sudo[66033]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lajjpaiigcdqlqrhvfsolmsffrcixtho ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013735.7590027-1107-109234224364961/AnsiballZ_copy.py'
Dec 06 09:35:36 compute-1 sudo[66033]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:36 compute-1 python3.9[66035]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013735.7590027-1107-109234224364961/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:35:36 compute-1 sudo[66033]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:37 compute-1 sudo[66185]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hacrspyutruapzrqbtsqztwqhjcoefjb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013737.3152022-1152-230299439609934/AnsiballZ_stat.py'
Dec 06 09:35:37 compute-1 sudo[66185]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:37 compute-1 python3.9[66187]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:35:37 compute-1 sudo[66185]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:38 compute-1 sudo[66308]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ajjgwnyupsrqlmehlqrizglqeavrcnkt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013737.3152022-1152-230299439609934/AnsiballZ_copy.py'
Dec 06 09:35:38 compute-1 sudo[66308]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:38 compute-1 python3.9[66310]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013737.3152022-1152-230299439609934/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:35:38 compute-1 sudo[66308]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:39 compute-1 sudo[66460]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tpgackztsdnznxutbxassqrnobeeaxva ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013738.888836-1197-45251254807162/AnsiballZ_stat.py'
Dec 06 09:35:39 compute-1 sudo[66460]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:39 compute-1 python3.9[66462]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:35:39 compute-1 sudo[66460]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:39 compute-1 sudo[66583]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rqjcwvkpkearvzccumooyugvmgwrzwtt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013738.888836-1197-45251254807162/AnsiballZ_copy.py'
Dec 06 09:35:39 compute-1 sudo[66583]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:40 compute-1 python3.9[66585]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013738.888836-1197-45251254807162/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:35:40 compute-1 sudo[66583]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:40 compute-1 sudo[66735]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xftbbpyrrzmposiqaqedxqyogmswuwxy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013740.4567454-1242-262655640109519/AnsiballZ_stat.py'
Dec 06 09:35:40 compute-1 sudo[66735]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:41 compute-1 python3.9[66737]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:35:41 compute-1 sudo[66735]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:41 compute-1 sudo[66858]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cwrgwtctlmhdixwzdvmwpipwznpkporx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013740.4567454-1242-262655640109519/AnsiballZ_copy.py'
Dec 06 09:35:41 compute-1 sudo[66858]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:41 compute-1 python3.9[66860]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013740.4567454-1242-262655640109519/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:35:41 compute-1 sudo[66858]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:42 compute-1 sudo[67010]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ntrxyclafrwqsvqywrxjybxqwoyjsuoc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013741.9514697-1288-56654241395874/AnsiballZ_stat.py'
Dec 06 09:35:42 compute-1 sudo[67010]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:42 compute-1 python3.9[67012]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:35:42 compute-1 sudo[67010]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:42 compute-1 sudo[67133]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rtksngbycosrjkmfhkmnsknjbnnvxkhr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013741.9514697-1288-56654241395874/AnsiballZ_copy.py'
Dec 06 09:35:42 compute-1 sudo[67133]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:43 compute-1 python3.9[67135]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765013741.9514697-1288-56654241395874/.source.nft follow=False _original_basename=ruleset.j2 checksum=693377dc03e5b6b24713cb537b18b88774724e35 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:35:43 compute-1 sudo[67133]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:43 compute-1 sudo[67285]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ghmrodmpvnxrxlgkkwgyuzydvyvvxrdy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013743.5150158-1332-223880992135069/AnsiballZ_file.py'
Dec 06 09:35:43 compute-1 sudo[67285]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:43 compute-1 python3.9[67287]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:35:44 compute-1 sudo[67285]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:44 compute-1 sudo[67437]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jcspujwllsuqtumtpduvoqttsuhqenxe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013744.2551107-1356-281152048282275/AnsiballZ_command.py'
Dec 06 09:35:44 compute-1 sudo[67437]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:44 compute-1 python3.9[67439]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:35:44 compute-1 sudo[67437]: pam_unix(sudo:session): session closed for user root
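The validation task above concatenates the EDPM nft fragments in dependency order (chains first, then flushes, rules, and the jump files) and feeds them to nft in check-only mode, so a syntax error fails the play before anything touches the live ruleset. Restating the logged command as a standalone shell check:

    # Dry-run the combined EDPM ruleset; -c parses and validates without applying
    cat /etc/nftables/edpm-chains.nft \
        /etc/nftables/edpm-flushes.nft \
        /etc/nftables/edpm-rules.nft \
        /etc/nftables/edpm-update-jumps.nft \
        /etc/nftables/edpm-jumps.nft | nft -c -f -
    echo "exit=$?"    # 0 means the combined ruleset parsed cleanly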
Dec 06 09:35:46 compute-1 sudo[67596]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gomndrjrrtqyrcqvitcmzaauqxzamyvx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013745.2730985-1380-121558954962538/AnsiballZ_blockinfile.py'
Dec 06 09:35:46 compute-1 sudo[67596]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:46 compute-1 python3.9[67598]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:35:46 compute-1 sudo[67596]: pam_unix(sudo:session): session closed for user root
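The blockinfile task maintains a marker-delimited block in /etc/sysconfig/nftables.conf and validates the candidate file with nft -c -f %s before the write lands. Expanding the logged block and marker parameters (marker "# {mark} ANSIBLE MANAGED BLOCK" with BEGIN/END), the managed section would read:

    # BEGIN ANSIBLE MANAGED BLOCK
    include "/etc/nftables/iptables.nft"
    include "/etc/nftables/edpm-chains.nft"
    include "/etc/nftables/edpm-rules.nft"
    include "/etc/nftables/edpm-jumps.nft"
    # END ANSIBLE MANAGED BLOCK

This is a reconstruction from the module arguments; the resulting file itself is not captured in the log.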
Dec 06 09:35:47 compute-1 sudo[67749]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bbgyhsppymxbflxmmsdtqghxuneqjkwj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013747.3210433-1407-51164702254573/AnsiballZ_file.py'
Dec 06 09:35:47 compute-1 sudo[67749]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:47 compute-1 python3.9[67751]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:35:47 compute-1 sudo[67749]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:48 compute-1 sudo[67903]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pwsgpuquuzmgaofhcjjuoapxpucwmlsa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013748.036456-1407-68899190159086/AnsiballZ_file.py'
Dec 06 09:35:48 compute-1 sudo[67903]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:48 compute-1 python3.9[67905]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:35:48 compute-1 sudo[67903]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:49 compute-1 sudo[68055]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aluiadwjmlrstqowphyohupejwluazbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013748.9825296-1452-22439808114261/AnsiballZ_mount.py'
Dec 06 09:35:49 compute-1 sudo[68055]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:49 compute-1 python3.9[68057]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Dec 06 09:35:49 compute-1 sudo[68055]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:50 compute-1 sudo[68208]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qoghfokelzpnxsjdmjendfhklrxyniqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013750.037018-1452-156651273362991/AnsiballZ_mount.py'
Dec 06 09:35:50 compute-1 sudo[68208]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:50 compute-1 python3.9[68210]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Dec 06 09:35:50 compute-1 sudo[68208]: pam_unix(sudo:session): session closed for user root
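The two ansible.posix.mount tasks with state=mounted both mount hugetlbfs with an explicit page size and persist the entries in /etc/fstab. The shell equivalent of what the module does here:

    mkdir -p /dev/hugepages1G /dev/hugepages2M
    mount -t hugetlbfs -o pagesize=1G none /dev/hugepages1G
    mount -t hugetlbfs -o pagesize=2M none /dev/hugepages2M
    # state=mounted also records fstab lines such as:
    #   none /dev/hugepages1G hugetlbfs pagesize=1G 0 0
    #   none /dev/hugepages2M hugetlbfs pagesize=2M 0 0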
Dec 06 09:35:51 compute-1 sshd-session[59002]: Connection closed by 192.168.122.30 port 59064
Dec 06 09:35:51 compute-1 sshd-session[58999]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:35:51 compute-1 systemd[1]: session-15.scope: Deactivated successfully.
Dec 06 09:35:51 compute-1 systemd[1]: session-15.scope: Consumed 38.870s CPU time.
Dec 06 09:35:51 compute-1 systemd-logind[788]: Session 15 logged out. Waiting for processes to exit.
Dec 06 09:35:51 compute-1 systemd-logind[788]: Removed session 15.
Dec 06 09:35:56 compute-1 sshd-session[68236]: Accepted publickey for zuul from 192.168.122.30 port 55098 ssh2: ECDSA SHA256:r1j7aLsKAM+XxDNbzEU5vWGpGNCOaIBwc7FZdATPttA
Dec 06 09:35:56 compute-1 systemd-logind[788]: New session 16 of user zuul.
Dec 06 09:35:56 compute-1 systemd[1]: Started Session 16 of User zuul.
Dec 06 09:35:56 compute-1 sshd-session[68236]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 06 09:35:57 compute-1 sudo[68389]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ssyxcrcztyaatujycwqlpbhwbfkgfmys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013756.5559826-19-125399642901939/AnsiballZ_tempfile.py'
Dec 06 09:35:57 compute-1 sudo[68389]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:57 compute-1 python3.9[68391]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Dec 06 09:35:57 compute-1 sudo[68389]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:57 compute-1 systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec 06 09:35:58 compute-1 sudo[68543]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-enibqfkcfxndhbjvflcfrdoyqqodxgpp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013757.5833094-55-14134552575767/AnsiballZ_stat.py'
Dec 06 09:35:58 compute-1 sudo[68543]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:58 compute-1 python3.9[68545]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:35:58 compute-1 sudo[68543]: pam_unix(sudo:session): session closed for user root
Dec 06 09:35:59 compute-1 sudo[68695]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lqfzfehxpxanysyzvitazivtkbuukgwu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013758.5863507-85-106295938179282/AnsiballZ_setup.py'
Dec 06 09:35:59 compute-1 sudo[68695]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:35:59 compute-1 python3.9[68697]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:35:59 compute-1 sudo[68695]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:00 compute-1 sudo[68847]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wyudjvehrjvnvrlntsrnpxtjfxmmjkoj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013759.7866857-110-237432636086091/AnsiballZ_blockinfile.py'
Dec 06 09:36:00 compute-1 sudo[68847]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:00 compute-1 python3.9[68849]: ansible-ansible.builtin.blockinfile Invoked with block=compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCtvqYC0W0zPSX/plyJvm0q1VGDScYTNlcCdllukOe81JRfU3GhVusPZOX0xRSaLP/lmXtfqWcbBRCkLsmFrAo2EHn1CMqMr5WkhY4+rgApF+MGLDOUo57tlKZLPIwdL0SSY/Qv8lBfrqr7LUDZ7fTTTbqTzim/bncxg/u0KxSWBdvjfmYi13SwO65wDkFqSVYa3h8DNij6cRRjQ0fJuJ9Da860hmMnqo9GJMU6dq3zMXXn3YfuF4E4M0UQdlWmVW4EwBTzsfA1XYbSpW7VdRJw6esB4vZ9/Succj+XZiANoDqL9gXSEjNXVVWVbL/7aGJJF9LLQ3VVxmHdbYs1NcTI6Yy9d61zDJHnK/nlYHMhmAHxiDsZEpv0xF72LLzaI86xxvnbx4eUpnyW6LnKiUCYUAUrWIMpLiIbWUxeIoYmj9rqLhwlo5kCy7WdCYYEMTtGI53oIyU0EbXf/r4WAuzmqpVRPyc2Sd5tYD4aXh1JZLUcZy+NLR0Y4SA8RflKFcs=
                                            compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFDJYF6pUvFgGUbY2QEOHAq7ZEhRQJUqPTVPOuTyb476
                                            compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPJ19afQPeSMtr3O9L1fe5+bNzTAsOOCA5fLihUdryDYc29KKD+0XABHKIvqeefcCsIBjZRA//9OzCUftfvXK9A=
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDAiB67qk/R3IfGpcAH1Ojopc8KX94De+Kxs31cKQLD04X+4QRXPRdMxU85LOhN58eKoHaBi8cgqk7+dvRypGD5vbtbRN9r0VN7tGwiSQTlVFbEuhn0AEbnRwNAMWEEMHO9kEjufP4N2zEEhtQBXy9oO2tMX3+BX4Z3YZZMQyZUgohdBHp2VCul9VdRuo0oHSr8HHm0nN61dMjalnThmgkGAu5hG8qhkWT4i9hroSKBsR5kVBUFTqdXekYkVy4YIYfM2lBXiMOFHtvr1a+KOyIfgWMb7GBPW7oKqtzCfVgSbGaUhSvGzs1OWt3U/PjjapIlmDnwD5ukzVxWV5ldh0vA48tXh5R1wqAoN5/Y/RiAKaY2kd/fvtkhvVDGZluXOz5jJ02IFHm+v4dP3Ig8YOuS5BEkWFuJHkblW0t/+4siTHWwmGEuvUI6y8Gb2pGcBKsWCJtLePYzT09IAmrjwO0jAgbWy0nvCZ+SKlbBBrXP6OgNgMkA+GH9iGOl6FOuRok=
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGYNj3LmNvR0emoQHuuy9NKXPivs/dznunVy8GExnJl8
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJhKmGSvg8FMw16qKPzk6Pyj+OHkN3bmk20mts1PdCRcNRnn9sT1DgI6U8Aze1tjGPujT4eDL+Y9r/hsrfM4qDc=
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDneZurSARwLaZA1xEymzXlvVAPvP8u0PCrqXuMYD5ewImDDChRITnk4XHKT/DUfrSJf9/7oJsddEbLRjhCtedqrMZsCkWz1BxtCmPBuvz2LfFhEn27TjqYLctOVGigQGsj6ILvPOzzLiapd93yApWDmH6P0un/ltmdM0iZLygNpzG3HLF8STBXzlo/8slci69Em7XppcrOpl1TS7DaVlpNcRQvo9pFuIrbMD9g0DOdMwk5YCH6g7OzGWqq0gt0YUOztmsqxWHKav3E0SXAD/vkgRc/1ZCNGFNSvf0dIgimCF3xlNWrppnvNgQ1BRqiQ7RArlOp1bVg0Ugdce6f4TIrq36Ois2U5+/myF5WQ7l9hRMRvoP64hSSsRAIDobTI/zMStUP3iZPFngxDxwQtpydHfFGywBL9811c42U7JsGxE8890uOIDk/oOkyhSH6KHQCPFjmKBJ98nT01lgnXyFSNOqds6QOYBasUWNFWd2wS7YpTheGlVVM8bk/gB4K2L0=
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOMkn8zp09tRuEaH/bUoP0rYj+dziM1KcqMKxOgM9K1U
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBCrMdvJJYP0cflC7RDFsxwr66nSp9R7QU726CAfJcKLw6vHh8Z9Lw5wLH0kiaSpsb6SAPffloplHEDiwTOkghOc=
                                             create=True mode=0644 path=/tmp/ansible.5_6wmhzq state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:36:00 compute-1 sudo[68847]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:01 compute-1 sudo[68999]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pcngdlcibsqzsgaswgeudddpekkloaua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013760.731081-134-146822614582165/AnsiballZ_command.py'
Dec 06 09:36:01 compute-1 sudo[68999]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:01 compute-1 python3.9[69001]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.5_6wmhzq' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:36:01 compute-1 sudo[68999]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:02 compute-1 sudo[69153]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gubmujgfzkxxonammwvpfllrzacfpphf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013761.7792044-158-43228693054559/AnsiballZ_file.py'
Dec 06 09:36:02 compute-1 sudo[69153]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:02 compute-1 python3.9[69155]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.5_6wmhzq state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:36:02 compute-1 sudo[69153]: pam_unix(sudo:session): session closed for user root
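The session ending here builds a cluster-wide /etc/ssh/ssh_known_hosts: a temp file is created, each node's ssh_host_key_*_public facts are gathered, blockinfile writes one line per (host aliases, key type, public key) into the temp file, cat copies it over the real file, and the temp file is removed. A condensed sketch of that flow (key material elided here; the real lines appear in the blockinfile record above):

    TMP=$(mktemp /tmp/ansible.XXXXXX)          # matches the tempfile task's prefix
    {
      echo '# BEGIN ANSIBLE MANAGED BLOCK'
      echo 'compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 ...'
      echo '# END ANSIBLE MANAGED BLOCK'
    } >> "$TMP"
    cat "$TMP" > /etc/ssh/ssh_known_hosts      # overwrite in place
    rm -f "$TMP"

Using cat rather than mv keeps the destination's existing owner, mode, and SELinux context, which is plausibly why the play takes this route instead of renaming the temp file into place.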
Dec 06 09:36:02 compute-1 sshd-session[68239]: Connection closed by 192.168.122.30 port 55098
Dec 06 09:36:02 compute-1 sshd-session[68236]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:36:02 compute-1 systemd[1]: session-16.scope: Deactivated successfully.
Dec 06 09:36:02 compute-1 systemd[1]: session-16.scope: Consumed 3.677s CPU time.
Dec 06 09:36:02 compute-1 systemd-logind[788]: Session 16 logged out. Waiting for processes to exit.
Dec 06 09:36:02 compute-1 systemd-logind[788]: Removed session 16.
Dec 06 09:36:08 compute-1 sshd-session[69180]: Accepted publickey for zuul from 192.168.122.30 port 58100 ssh2: ECDSA SHA256:r1j7aLsKAM+XxDNbzEU5vWGpGNCOaIBwc7FZdATPttA
Dec 06 09:36:08 compute-1 systemd-logind[788]: New session 17 of user zuul.
Dec 06 09:36:08 compute-1 systemd[1]: Started Session 17 of User zuul.
Dec 06 09:36:08 compute-1 sshd-session[69180]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 06 09:36:09 compute-1 python3.9[69333]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:36:10 compute-1 sudo[69487]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yonsrlxesmfecvinsoxxapazezrncguv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013769.8776162-57-157827574618469/AnsiballZ_systemd.py'
Dec 06 09:36:10 compute-1 sudo[69487]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:10 compute-1 python3.9[69489]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Dec 06 09:36:10 compute-1 sudo[69487]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:11 compute-1 sudo[69641]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-euhgfosrjetzmxxatpoxpjlyycsgnomn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013771.1794581-81-160524501325932/AnsiballZ_systemd.py'
Dec 06 09:36:11 compute-1 sudo[69641]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:11 compute-1 python3.9[69643]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 06 09:36:11 compute-1 sudo[69641]: pam_unix(sudo:session): session closed for user root
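The two ansible.builtin.systemd tasks simply ensure sshd is enabled at boot and currently running; the one-line CLI equivalent:

    systemctl enable --now sshd    # enabled=True + state=started in one step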
Dec 06 09:36:12 compute-1 sudo[69794]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-skmldxeajtkjalmhcbdircnilsodqfwb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013772.2208989-108-41466736879841/AnsiballZ_command.py'
Dec 06 09:36:12 compute-1 sudo[69794]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:12 compute-1 python3.9[69796]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:36:12 compute-1 sudo[69794]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:13 compute-1 sudo[69947]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vbrluodkcpsggqodkblrbljagsbbunyq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013773.2045374-132-281198449571117/AnsiballZ_stat.py'
Dec 06 09:36:13 compute-1 sudo[69947]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:13 compute-1 python3.9[69949]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:36:13 compute-1 sudo[69947]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:14 compute-1 sudo[70101]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zkcffpptmxixctdojomzoinujklhlfin ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013774.1322515-156-64284261690455/AnsiballZ_command.py'
Dec 06 09:36:14 compute-1 sudo[70101]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:14 compute-1 python3.9[70103]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:36:14 compute-1 sudo[70101]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:15 compute-1 sudo[70256]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hiqbtccofhqaaekgkosnioytnypkedeh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013774.9419763-180-264729265748437/AnsiballZ_file.py'
Dec 06 09:36:15 compute-1 sudo[70256]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:15 compute-1 python3.9[70258]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:36:15 compute-1 sudo[70256]: pam_unix(sudo:session): session closed for user root
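Application mirrors the earlier validation: chains are loaded first on their own, then flushes, rules, and update-jumps are streamed into nft together, and the edpm-rules.nft.changed flag is removed. The stat on the flag file followed by its deletion suggests a change-marker convention, with the reload gated on the marker's existence (the gating is inferred from the stat/remove pair, not stated in the log); edpm-jumps.nft is absent from the live reload, presumably because it is picked up at boot via nftables.conf. A sketch of that flow:

    nft -f /etc/nftables/edpm-chains.nft            # create/refresh chains only
    if [ -e /etc/nftables/edpm-rules.nft.changed ]; then
        cat /etc/nftables/edpm-flushes.nft \
            /etc/nftables/edpm-rules.nft \
            /etc/nftables/edpm-update-jumps.nft | nft -f -
        rm -f /etc/nftables/edpm-rules.nft.changed  # consume the change marker
    fi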
Dec 06 09:36:16 compute-1 sshd-session[69183]: Connection closed by 192.168.122.30 port 58100
Dec 06 09:36:16 compute-1 sshd-session[69180]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:36:16 compute-1 systemd[1]: session-17.scope: Deactivated successfully.
Dec 06 09:36:16 compute-1 systemd[1]: session-17.scope: Consumed 4.888s CPU time.
Dec 06 09:36:16 compute-1 systemd-logind[788]: Session 17 logged out. Waiting for processes to exit.
Dec 06 09:36:16 compute-1 systemd-logind[788]: Removed session 17.
Dec 06 09:36:21 compute-1 sshd-session[70283]: Accepted publickey for zuul from 192.168.122.30 port 37746 ssh2: ECDSA SHA256:r1j7aLsKAM+XxDNbzEU5vWGpGNCOaIBwc7FZdATPttA
Dec 06 09:36:21 compute-1 systemd-logind[788]: New session 18 of user zuul.
Dec 06 09:36:21 compute-1 systemd[1]: Started Session 18 of User zuul.
Dec 06 09:36:21 compute-1 sshd-session[70283]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 06 09:36:22 compute-1 python3.9[70436]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:36:23 compute-1 sudo[70590]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pntwzlkolvzdiqreyyekqilkwtkssafk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013783.312268-63-84987453179191/AnsiballZ_setup.py'
Dec 06 09:36:23 compute-1 sudo[70590]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:23 compute-1 python3.9[70592]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 06 09:36:24 compute-1 sudo[70590]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:24 compute-1 sudo[70674]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sshoictahpfharhbsoxwwrxemkjqsmmz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765013783.312268-63-84987453179191/AnsiballZ_dnf.py'
Dec 06 09:36:24 compute-1 sudo[70674]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:24 compute-1 python3.9[70676]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 06 09:36:26 compute-1 sudo[70674]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:27 compute-1 python3.9[70827]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
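needs-restarting comes from the yum-utils package installed in the preceding task; with -r it only checks whether a full reboot is advisable (updated kernel, glibc, systemd, and so on) and signals the result through its exit code, which is what the play registers here:

    # exits 0 when no reboot is needed, 1 when core packages were updated
    if ! needs-restarting -r; then
        echo "reboot required"
    fi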
Dec 06 09:36:28 compute-1 python3.9[70978]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 06 09:36:29 compute-1 python3.9[71128]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:36:29 compute-1 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 06 09:36:30 compute-1 python3.9[71279]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:36:30 compute-1 sshd-session[70286]: Connection closed by 192.168.122.30 port 37746
Dec 06 09:36:30 compute-1 sshd-session[70283]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:36:30 compute-1 systemd[1]: session-18.scope: Deactivated successfully.
Dec 06 09:36:30 compute-1 systemd[1]: session-18.scope: Consumed 6.677s CPU time.
Dec 06 09:36:30 compute-1 systemd-logind[788]: Session 18 logged out. Waiting for processes to exit.
Dec 06 09:36:30 compute-1 systemd-logind[788]: Removed session 18.
Dec 06 09:36:38 compute-1 chronyd[58518]: Selected source 23.133.168.246 (pool.ntp.org)
Dec 06 09:36:41 compute-1 sshd-session[71304]: Accepted publickey for zuul from 38.102.83.98 port 58744 ssh2: RSA SHA256:spwPcL19sPHC+yJA+ECEA4UNmpshOiR8KfgtTbViJeA
Dec 06 09:36:41 compute-1 systemd-logind[788]: New session 19 of user zuul.
Dec 06 09:36:41 compute-1 systemd[1]: Started Session 19 of User zuul.
Dec 06 09:36:41 compute-1 sshd-session[71304]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 06 09:36:42 compute-1 sudo[71380]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-byrpjhrbcqnlbzveokxvmfpfsxtiemjb ; /usr/bin/python3'
Dec 06 09:36:42 compute-1 sudo[71380]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:42 compute-1 useradd[71384]: new group: name=ceph-admin, GID=42478
Dec 06 09:36:42 compute-1 useradd[71384]: new user: name=ceph-admin, UID=42477, GID=42478, home=/home/ceph-admin, shell=/bin/bash, from=none
Dec 06 09:36:42 compute-1 sudo[71380]: pam_unix(sudo:session): session closed for user root
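This first become session creates the ceph-admin account that the Ceph orchestrator will later ssh in as. Per the useradd records, the explicit equivalent would be roughly:

    groupadd --gid 42478 ceph-admin
    useradd --uid 42477 --gid 42478 --create-home \
            --home-dir /home/ceph-admin --shell /bin/bash ceph-admin

A single useradd call with per-user groups enabled would produce both records; the split form above is an equivalent sketch, not the exact command, which the log does not capture.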
Dec 06 09:36:43 compute-1 sudo[71466]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fbjoppkkkrdwlyfgvstudovvirsnfqva ; /usr/bin/python3'
Dec 06 09:36:43 compute-1 sudo[71466]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:43 compute-1 sudo[71466]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:43 compute-1 sudo[71539]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqljqyvmwbbftaymlkipdliyijfzsvfi ; /usr/bin/python3'
Dec 06 09:36:43 compute-1 sudo[71539]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:43 compute-1 sudo[71539]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:44 compute-1 sudo[71589]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ggcnotglrbhfkdzgvdwtwbgsisitwyvg ; /usr/bin/python3'
Dec 06 09:36:44 compute-1 sudo[71589]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:45 compute-1 sudo[71589]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:45 compute-1 sudo[71615]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iphsrcjfvrfqwwholomcfpsazmodxaqo ; /usr/bin/python3'
Dec 06 09:36:45 compute-1 sudo[71615]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:45 compute-1 sudo[71615]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:45 compute-1 sudo[71641]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sgelziamwahpeieyejdsxhfleetlppkb ; /usr/bin/python3'
Dec 06 09:36:45 compute-1 sudo[71641]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:46 compute-1 sudo[71641]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:46 compute-1 sudo[71667]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jotebbxhrvnbcziathrtcrrjrkxxlpik ; /usr/bin/python3'
Dec 06 09:36:46 compute-1 sudo[71667]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:46 compute-1 sudo[71667]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:46 compute-1 sudo[71745]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-brmzbbzgfzzlzcovsshznumptznhyubi ; /usr/bin/python3'
Dec 06 09:36:46 compute-1 sudo[71745]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:47 compute-1 sudo[71745]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:47 compute-1 sudo[71818]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fitnjjkidtvxhlefzwgzsqtvdanazgmz ; /usr/bin/python3'
Dec 06 09:36:47 compute-1 sudo[71818]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:47 compute-1 sudo[71818]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:48 compute-1 sudo[71920]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-toeqjdlgzukejxqigbpdmvjtipjjeqwa ; /usr/bin/python3'
Dec 06 09:36:48 compute-1 sudo[71920]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:48 compute-1 sudo[71920]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:48 compute-1 sudo[71993]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vlofgwgwdgbdtbrufjlayloxhgyutyrd ; /usr/bin/python3'
Dec 06 09:36:48 compute-1 sudo[71993]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:48 compute-1 sudo[71993]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:49 compute-1 sudo[72043]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uresxehzxorqdofcbeonnqzqfqxhephs ; /usr/bin/python3'
Dec 06 09:36:49 compute-1 sudo[72043]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:49 compute-1 python3[72045]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:36:50 compute-1 sudo[72043]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:50 compute-1 sudo[72138]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gonejumbozvxozkxceojhxvpdaiyjmxt ; /usr/bin/python3'
Dec 06 09:36:51 compute-1 sudo[72138]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:51 compute-1 python3[72140]: ansible-ansible.legacy.dnf Invoked with name=['util-linux', 'lvm2', 'jq', 'podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 06 09:36:52 compute-1 sudo[72138]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:52 compute-1 sudo[72165]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mewfdvasluaoahtbzwqyhhbywattzrls ; /usr/bin/python3'
Dec 06 09:36:52 compute-1 sudo[72165]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:53 compute-1 python3[72167]: ansible-ansible.builtin.stat Invoked with path=/dev/loop3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 06 09:36:53 compute-1 sudo[72165]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:53 compute-1 sudo[72191]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kxzhavxdsegjflwzqstzungykuzwaqrx ; /usr/bin/python3'
Dec 06 09:36:53 compute-1 sudo[72191]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:53 compute-1 python3[72193]: ansible-ansible.legacy.command Invoked with _raw_params=dd if=/dev/zero of=/var/lib/ceph-osd-0.img bs=1 count=0 seek=20G
                                          losetup /dev/loop3 /var/lib/ceph-osd-0.img
                                          lsblk _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:36:53 compute-1 kernel: loop: module loaded
Dec 06 09:36:53 compute-1 kernel: loop3: detected capacity change from 0 to 41943040
Dec 06 09:36:53 compute-1 sudo[72191]: pam_unix(sudo:session): session closed for user root
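The OSD backing store is a 20 GiB sparse file attached to /dev/loop3: dd with count=0 and seek=20G only sets the file's size, so no blocks are written until used, and the kernel's "capacity change from 0 to 41943040" (512-byte sectors) confirms the 20 GiB size. The logged command sequence, annotated:

    dd if=/dev/zero of=/var/lib/ceph-osd-0.img bs=1 count=0 seek=20G  # sparse 20G file
    losetup /dev/loop3 /var/lib/ceph-osd-0.img                        # attach as a loop device
    lsblk                                                             # verify loop3 shows 20G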
Dec 06 09:36:53 compute-1 sudo[72226]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vpgibvdbqkodudwzmsnsvsuymplwzgrq ; /usr/bin/python3'
Dec 06 09:36:53 compute-1 sudo[72226]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:53 compute-1 python3[72228]: ansible-ansible.legacy.command Invoked with _raw_params=pvcreate /dev/loop3
                                          vgcreate ceph_vg0 /dev/loop3
                                          lvcreate -n ceph_lv0 -l +100%FREE ceph_vg0
                                          lvs _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:36:53 compute-1 lvm[72231]: PV /dev/loop3 not used.
Dec 06 09:36:54 compute-1 lvm[72233]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 06 09:36:54 compute-1 systemd[1]: Started /usr/sbin/lvm vgchange -aay --autoactivation event ceph_vg0.
Dec 06 09:36:54 compute-1 lvm[72243]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 06 09:36:54 compute-1 lvm[72243]: VG ceph_vg0 finished
Dec 06 09:36:54 compute-1 lvm[72241]:   1 logical volume(s) in volume group "ceph_vg0" now active
Dec 06 09:36:54 compute-1 systemd[1]: lvm-activate-ceph_vg0.service: Deactivated successfully.
Dec 06 09:36:54 compute-1 sudo[72226]: pam_unix(sudo:session): session closed for user root
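On top of the loop device the play builds a single-LV volume group that cephadm can later consume as an OSD device; the lvm[...] lines that follow are the autoactivation triggered by these commands:

    pvcreate /dev/loop3                           # label the loop device as an LVM PV
    vgcreate ceph_vg0 /dev/loop3                  # single-PV volume group
    lvcreate -n ceph_lv0 -l +100%FREE ceph_vg0    # one LV spanning the whole VG
    lvs                                           # confirm ceph_vg0/ceph_lv0 exists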
Dec 06 09:36:54 compute-1 sudo[72319]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kktjoynahahjpjbwjfslwaqiopcrftlq ; /usr/bin/python3'
Dec 06 09:36:54 compute-1 sudo[72319]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:54 compute-1 python3[72321]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/ceph-osd-losetup-0.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 06 09:36:54 compute-1 sudo[72319]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:54 compute-1 sudo[72392]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oyorlxyjtgwxtayhtsnyhxrrplxevovl ; /usr/bin/python3'
Dec 06 09:36:54 compute-1 sudo[72392]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:54 compute-1 python3[72394]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765013814.3516102-36828-265389727803626/source dest=/etc/systemd/system/ceph-osd-losetup-0.service mode=0644 force=True follow=False _original_basename=ceph-osd-losetup.service.j2 checksum=427b1db064a970126b729b07acf99fa7d0eecb9c backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:36:55 compute-1 sudo[72392]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:55 compute-1 sudo[72442]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ukmojatkdigwqfjautrjaumwauzfvtvq ; /usr/bin/python3'
Dec 06 09:36:55 compute-1 sudo[72442]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:36:55 compute-1 python3[72444]: ansible-ansible.builtin.systemd Invoked with state=started enabled=True name=ceph-osd-losetup-0.service daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:36:55 compute-1 systemd[1]: Reloading.
Dec 06 09:36:55 compute-1 systemd-rc-local-generator[72477]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:36:55 compute-1 systemd-sysv-generator[72481]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:36:56 compute-1 systemd[1]: Starting Ceph OSD losetup...
Dec 06 09:36:56 compute-1 bash[72485]: /dev/loop3: [64513]:4327945 (/var/lib/ceph-osd-0.img)
Dec 06 09:36:56 compute-1 systemd[1]: Finished Ceph OSD losetup.
Dec 06 09:36:56 compute-1 sudo[72442]: pam_unix(sudo:session): session closed for user root
Dec 06 09:36:56 compute-1 lvm[72486]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 06 09:36:56 compute-1 lvm[72486]: VG ceph_vg0 finished
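Loop device attachments do not persist across reboots, so a oneshot unit re-attaches the backing file at boot. Only the unit's path and checksum appear in the log, not its contents; the following is a plausible sketch of such a unit, not the deployed file (the ExecStart body in particular is assumed, chosen to match the losetup listing printed by bash[72485] above):

    cat > /etc/systemd/system/ceph-osd-losetup-0.service <<'EOF'
    [Unit]
    Description=Ceph OSD losetup
    After=local-fs.target

    [Service]
    Type=oneshot
    RemainAfterExit=yes
    # hypothetical: print the association if attached, otherwise attach the image
    ExecStart=/bin/bash -c 'losetup /dev/loop3 || losetup /dev/loop3 /var/lib/ceph-osd-0.img'

    [Install]
    WantedBy=multi-user.target
    EOF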
Dec 06 09:36:58 compute-1 python3[72510]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:37:06 compute-1 sshd-session[72554]: Connection closed by 222.88.225.195 port 47384
Dec 06 09:38:32 compute-1 sshd-session[72556]: Accepted publickey for ceph-admin from 192.168.122.100 port 59048 ssh2: RSA SHA256:Gxeh0g0CuyN5zOpDUv+8o0JynyC1ASnaMny1857KGxo
Dec 06 09:38:32 compute-1 systemd[1]: Created slice User Slice of UID 42477.
Dec 06 09:38:32 compute-1 systemd[1]: Starting User Runtime Directory /run/user/42477...
Dec 06 09:38:32 compute-1 systemd-logind[788]: New session 20 of user ceph-admin.
Dec 06 09:38:32 compute-1 systemd[1]: Finished User Runtime Directory /run/user/42477.
Dec 06 09:38:32 compute-1 systemd[1]: Starting User Manager for UID 42477...
Dec 06 09:38:32 compute-1 systemd[72560]: pam_unix(systemd-user:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Dec 06 09:38:32 compute-1 systemd[72560]: Queued start job for default target Main User Target.
Dec 06 09:38:32 compute-1 systemd[72560]: Created slice User Application Slice.
Dec 06 09:38:32 compute-1 systemd[72560]: Started Mark boot as successful after the user session has run 2 minutes.
Dec 06 09:38:32 compute-1 systemd[72560]: Started Daily Cleanup of User's Temporary Directories.
Dec 06 09:38:32 compute-1 systemd[72560]: Reached target Paths.
Dec 06 09:38:32 compute-1 systemd[72560]: Reached target Timers.
Dec 06 09:38:32 compute-1 systemd[72560]: Starting D-Bus User Message Bus Socket...
Dec 06 09:38:32 compute-1 systemd[72560]: Starting Create User's Volatile Files and Directories...
Dec 06 09:38:32 compute-1 sshd-session[72574]: Accepted publickey for ceph-admin from 192.168.122.100 port 59058 ssh2: RSA SHA256:Gxeh0g0CuyN5zOpDUv+8o0JynyC1ASnaMny1857KGxo
Dec 06 09:38:32 compute-1 systemd[72560]: Listening on D-Bus User Message Bus Socket.
Dec 06 09:38:32 compute-1 systemd[72560]: Finished Create User's Volatile Files and Directories.
Dec 06 09:38:32 compute-1 systemd[72560]: Reached target Sockets.
Dec 06 09:38:32 compute-1 systemd[72560]: Reached target Basic System.
Dec 06 09:38:32 compute-1 systemd[72560]: Reached target Main User Target.
Dec 06 09:38:32 compute-1 systemd[72560]: Startup finished in 159ms.
Dec 06 09:38:32 compute-1 systemd[1]: Started User Manager for UID 42477.
Dec 06 09:38:32 compute-1 systemd[1]: Started Session 20 of User ceph-admin.
Dec 06 09:38:32 compute-1 sshd-session[72556]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Dec 06 09:38:32 compute-1 systemd-logind[788]: New session 22 of user ceph-admin.
Dec 06 09:38:33 compute-1 systemd[1]: Started Session 22 of User ceph-admin.
Dec 06 09:38:33 compute-1 sshd-session[72574]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Dec 06 09:38:33 compute-1 sudo[72581]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:38:33 compute-1 sudo[72581]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:38:33 compute-1 sudo[72581]: pam_unix(sudo:session): session closed for user root
Dec 06 09:38:33 compute-1 sshd-session[72606]: Accepted publickey for ceph-admin from 192.168.122.100 port 59060 ssh2: RSA SHA256:Gxeh0g0CuyN5zOpDUv+8o0JynyC1ASnaMny1857KGxo
Dec 06 09:38:33 compute-1 systemd-logind[788]: New session 23 of user ceph-admin.
Dec 06 09:38:33 compute-1 systemd[1]: Started Session 23 of User ceph-admin.
Dec 06 09:38:33 compute-1 sshd-session[72606]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Dec 06 09:38:33 compute-1 sudo[72610]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 check-host --expect-hostname compute-1
Dec 06 09:38:33 compute-1 sudo[72610]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:38:33 compute-1 sudo[72610]: pam_unix(sudo:session): session closed for user root
Dec 06 09:38:33 compute-1 sshd-session[72635]: Accepted publickey for ceph-admin from 192.168.122.100 port 59074 ssh2: RSA SHA256:Gxeh0g0CuyN5zOpDUv+8o0JynyC1ASnaMny1857KGxo
Dec 06 09:38:33 compute-1 systemd-logind[788]: New session 24 of user ceph-admin.
Dec 06 09:38:33 compute-1 systemd[1]: Started Session 24 of User ceph-admin.
Dec 06 09:38:33 compute-1 sshd-session[72635]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Dec 06 09:38:33 compute-1 sudo[72639]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36
Dec 06 09:38:33 compute-1 sudo[72639]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:38:33 compute-1 sudo[72639]: pam_unix(sudo:session): session closed for user root
Dec 06 09:38:34 compute-1 sshd-session[72664]: Accepted publickey for ceph-admin from 192.168.122.100 port 59076 ssh2: RSA SHA256:Gxeh0g0CuyN5zOpDUv+8o0JynyC1ASnaMny1857KGxo
Dec 06 09:38:34 compute-1 systemd-logind[788]: New session 25 of user ceph-admin.
Dec 06 09:38:34 compute-1 systemd[1]: Started Session 25 of User ceph-admin.
Dec 06 09:38:34 compute-1 sshd-session[72664]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Dec 06 09:38:34 compute-1 sudo[72668]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258
Dec 06 09:38:34 compute-1 sudo[72668]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:38:34 compute-1 sudo[72668]: pam_unix(sudo:session): session closed for user root
Dec 06 09:38:34 compute-1 sshd-session[72693]: Accepted publickey for ceph-admin from 192.168.122.100 port 59082 ssh2: RSA SHA256:Gxeh0g0CuyN5zOpDUv+8o0JynyC1ASnaMny1857KGxo
Dec 06 09:38:34 compute-1 systemd-logind[788]: New session 26 of user ceph-admin.
Dec 06 09:38:34 compute-1 systemd[1]: Started Session 26 of User ceph-admin.
Dec 06 09:38:34 compute-1 sshd-session[72693]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Dec 06 09:38:34 compute-1 sudo[72697]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258
Dec 06 09:38:34 compute-1 sudo[72697]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:38:34 compute-1 sudo[72697]: pam_unix(sudo:session): session closed for user root
Dec 06 09:38:34 compute-1 sshd-session[72722]: Accepted publickey for ceph-admin from 192.168.122.100 port 59092 ssh2: RSA SHA256:Gxeh0g0CuyN5zOpDUv+8o0JynyC1ASnaMny1857KGxo
Dec 06 09:38:34 compute-1 systemd-logind[788]: New session 27 of user ceph-admin.
Dec 06 09:38:34 compute-1 systemd[1]: Started Session 27 of User ceph-admin.
Dec 06 09:38:34 compute-1 sshd-session[72722]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Dec 06 09:38:35 compute-1 sudo[72726]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36.new
Dec 06 09:38:35 compute-1 sudo[72726]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:38:35 compute-1 sudo[72726]: pam_unix(sudo:session): session closed for user root
Dec 06 09:38:35 compute-1 sshd-session[72751]: Accepted publickey for ceph-admin from 192.168.122.100 port 59102 ssh2: RSA SHA256:Gxeh0g0CuyN5zOpDUv+8o0JynyC1ASnaMny1857KGxo
Dec 06 09:38:35 compute-1 systemd-logind[788]: New session 28 of user ceph-admin.
Dec 06 09:38:35 compute-1 systemd[1]: Started Session 28 of User ceph-admin.
Dec 06 09:38:35 compute-1 sshd-session[72751]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Dec 06 09:38:35 compute-1 sudo[72755]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258
Dec 06 09:38:35 compute-1 sudo[72755]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:38:35 compute-1 sudo[72755]: pam_unix(sudo:session): session closed for user root
Dec 06 09:38:35 compute-1 sshd-session[72780]: Accepted publickey for ceph-admin from 192.168.122.100 port 59116 ssh2: RSA SHA256:Gxeh0g0CuyN5zOpDUv+8o0JynyC1ASnaMny1857KGxo
Dec 06 09:38:35 compute-1 systemd-logind[788]: New session 29 of user ceph-admin.
Dec 06 09:38:35 compute-1 systemd[1]: Started Session 29 of User ceph-admin.
Dec 06 09:38:35 compute-1 sshd-session[72780]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Dec 06 09:38:35 compute-1 sudo[72784]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36.new
Dec 06 09:38:35 compute-1 sudo[72784]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:38:35 compute-1 sudo[72784]: pam_unix(sudo:session): session closed for user root
Dec 06 09:38:36 compute-1 sshd-session[72809]: Accepted publickey for ceph-admin from 192.168.122.100 port 59122 ssh2: RSA SHA256:Gxeh0g0CuyN5zOpDUv+8o0JynyC1ASnaMny1857KGxo
Dec 06 09:38:36 compute-1 systemd-logind[788]: New session 30 of user ceph-admin.
Dec 06 09:38:36 compute-1 systemd[1]: Started Session 30 of User ceph-admin.
Dec 06 09:38:36 compute-1 sshd-session[72809]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Dec 06 09:38:37 compute-1 sshd-session[72836]: Accepted publickey for ceph-admin from 192.168.122.100 port 59134 ssh2: RSA SHA256:Gxeh0g0CuyN5zOpDUv+8o0JynyC1ASnaMny1857KGxo
Dec 06 09:38:37 compute-1 systemd-logind[788]: New session 31 of user ceph-admin.
Dec 06 09:38:37 compute-1 systemd[1]: Started Session 31 of User ceph-admin.
Dec 06 09:38:37 compute-1 sshd-session[72836]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Dec 06 09:38:37 compute-1 sudo[72840]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36.new /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36
Dec 06 09:38:37 compute-1 sudo[72840]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:38:37 compute-1 sudo[72840]: pam_unix(sudo:session): session closed for user root
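The string of one-command ssh sessions above is cephadm's remote staging protocol: the script is written as a .new file under a private /tmp tree that mirrors the destination path, ownership and mode are fixed up, and the completed file is then moved into /var/lib/ceph/<fsid>/. Restated as shell, with the identifiers copied from the log:

    FSID=5ecd3f74-dade-5fc4-92ce-8950ae424258
    HASH=1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36
    mkdir -p /var/lib/ceph/$FSID
    mkdir -p /tmp/cephadm-$FSID/var/lib/ceph/$FSID
    touch /tmp/cephadm-$FSID/var/lib/ceph/$FSID/cephadm.$HASH.new
    chown -R ceph-admin /tmp/cephadm-$FSID    # let the unprivileged copy write the payload
    chmod 644 /tmp/cephadm-$FSID/var/lib/ceph/$FSID/cephadm.$HASH.new
    # (the script body is transferred over ssh between these steps; not visible in the log)
    mv /tmp/cephadm-$FSID/var/lib/ceph/$FSID/cephadm.$HASH.new \
       /var/lib/ceph/$FSID/cephadm.$HASH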
Dec 06 09:38:37 compute-1 sshd-session[72865]: Accepted publickey for ceph-admin from 192.168.122.100 port 59142 ssh2: RSA SHA256:Gxeh0g0CuyN5zOpDUv+8o0JynyC1ASnaMny1857KGxo
Dec 06 09:38:37 compute-1 systemd-logind[788]: New session 32 of user ceph-admin.
Dec 06 09:38:37 compute-1 systemd[1]: Started Session 32 of User ceph-admin.
Dec 06 09:38:37 compute-1 sshd-session[72865]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Dec 06 09:38:37 compute-1 sudo[72869]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 check-host --expect-hostname compute-1
Dec 06 09:38:37 compute-1 sudo[72869]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:38:38 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 06 09:38:38 compute-1 sudo[72869]: pam_unix(sudo:session): session closed for user root
Dec 06 09:38:38 compute-1 sudo[72915]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:38:38 compute-1 sudo[72915]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:38:38 compute-1 sudo[72915]: pam_unix(sudo:session): session closed for user root
Dec 06 09:38:38 compute-1 sudo[72940]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 check-host
Dec 06 09:38:38 compute-1 sudo[72940]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:38:38 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 06 09:38:38 compute-1 sudo[72940]: pam_unix(sudo:session): session closed for user root
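Two host checks run back to back: `check-host --expect-hostname compute-1` confirms the orchestrator reached the host it intended, and the plain `check-host` verifies host prerequisites (container engine, systemd, time synchronisation and the like; the exact checks vary by release). Each remote call is preceded by a `which python3` probe to locate an interpreter. The same check can be repeated by hand when a host fails enrollment, e.g.:

    sudo python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.<digest> \
        --timeout 895 check-host --expect-hostname "$(hostname)"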
Dec 06 09:38:38 compute-1 sudo[72985]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:38:38 compute-1 sudo[72985]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:38:38 compute-1 sudo[72985]: pam_unix(sudo:session): session closed for user root
Dec 06 09:38:38 compute-1 sudo[73010]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Dec 06 09:38:38 compute-1 sudo[73010]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:38:39 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 06 09:38:39 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 06 09:38:39 compute-1 sudo[73010]: pam_unix(sudo:session): session closed for user root
Dec 06 09:38:39 compute-1 sudo[73073]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:38:39 compute-1 sudo[73073]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:38:39 compute-1 sudo[73073]: pam_unix(sudo:session): session closed for user root
Dec 06 09:38:39 compute-1 sudo[73098]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 06 09:38:39 compute-1 sudo[73098]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:38:39 compute-1 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 73135 (sysctl)
Dec 06 09:38:40 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 06 09:38:40 compute-1 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Dec 06 09:38:40 compute-1 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Dec 06 09:38:40 compute-1 sudo[73098]: pam_unix(sudo:session): session closed for user root
Dec 06 09:38:40 compute-1 sudo[73157]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:38:40 compute-1 sudo[73157]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:38:40 compute-1 sudo[73157]: pam_unix(sudo:session): session closed for user root
Dec 06 09:38:40 compute-1 sudo[73182]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 list-networks
Dec 06 09:38:40 compute-1 sudo[73182]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:38:40 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 06 09:38:41 compute-1 sudo[73182]: pam_unix(sudo:session): session closed for user root
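`ls`, `gather-facts` and `list-networks` are the inventory half of the handshake: each prints a JSON document on stdout (daemons already deployed on the host, hardware facts, and per-subnet interface/address maps respectively) for the mgr's cephadm module to consume. With jq on the host the output is easy to eyeball; the field names below are illustrative and release-dependent:

    cephadm_bin=/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.<digest>
    sudo python3 "$cephadm_bin" gather-facts  | jq '.hostname, .memory_total_kb'
    sudo python3 "$cephadm_bin" list-networks | jq 'keys'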
Dec 06 09:38:41 compute-1 sudo[73225]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:38:41 compute-1 sudo[73225]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:38:41 compute-1 sudo[73225]: pam_unix(sudo:session): session closed for user root
Dec 06 09:38:41 compute-1 sudo[73250]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ceph-volume --fsid 5ecd3f74-dade-5fc4-92ce-8950ae424258 -- inventory --format=json-pretty --filter-for-batch
Dec 06 09:38:41 compute-1 sudo[73250]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:38:41 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 06 09:38:41 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 06 09:38:44 compute-1 systemd[1]: var-lib-containers-storage-overlay-compat363875030-lower\x2dmapped.mount: Deactivated successfully.
Dec 06 09:39:12 compute-1 podman[73310]: 2025-12-06 09:39:12.666852424 +0000 UTC m=+31.158415124 container create 81358b6ab17a0c17589dff90f711e3456d530a0f9c43daee99c79c2adfb69af6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=awesome_shaw, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 06 09:39:12 compute-1 systemd[1]: var-lib-containers-storage-overlay-volatile\x2dcheck224748378-merged.mount: Deactivated successfully.
Dec 06 09:39:12 compute-1 systemd[1]: Created slice Virtual Machine and Container Slice.
Dec 06 09:39:12 compute-1 systemd[1]: Started libpod-conmon-81358b6ab17a0c17589dff90f711e3456d530a0f9c43daee99c79c2adfb69af6.scope.
Dec 06 09:39:12 compute-1 podman[73310]: 2025-12-06 09:39:12.645746346 +0000 UTC m=+31.137309076 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 06 09:39:12 compute-1 systemd[1]: Started libcrun container.
Dec 06 09:39:12 compute-1 podman[73310]: 2025-12-06 09:39:12.7829792 +0000 UTC m=+31.274541940 container init 81358b6ab17a0c17589dff90f711e3456d530a0f9c43daee99c79c2adfb69af6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=awesome_shaw, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2)
Dec 06 09:39:12 compute-1 podman[73310]: 2025-12-06 09:39:12.791273536 +0000 UTC m=+31.282836236 container start 81358b6ab17a0c17589dff90f711e3456d530a0f9c43daee99c79c2adfb69af6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=awesome_shaw, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 06 09:39:12 compute-1 podman[73310]: 2025-12-06 09:39:12.795109018 +0000 UTC m=+31.286671718 container attach 81358b6ab17a0c17589dff90f711e3456d530a0f9c43daee99c79c2adfb69af6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=awesome_shaw, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 09:39:12 compute-1 awesome_shaw[73370]: 167 167
Dec 06 09:39:12 compute-1 systemd[1]: libpod-81358b6ab17a0c17589dff90f711e3456d530a0f9c43daee99c79c2adfb69af6.scope: Deactivated successfully.
Dec 06 09:39:12 compute-1 conmon[73370]: conmon 81358b6ab17a0c17589d <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-81358b6ab17a0c17589dff90f711e3456d530a0f9c43daee99c79c2adfb69af6.scope/container/memory.events
Dec 06 09:39:12 compute-1 podman[73310]: 2025-12-06 09:39:12.799311613 +0000 UTC m=+31.290874313 container died 81358b6ab17a0c17589dff90f711e3456d530a0f9c43daee99c79c2adfb69af6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=awesome_shaw, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325)
Dec 06 09:39:12 compute-1 systemd[1]: var-lib-containers-storage-overlay-3390ceddcaa2e212e5838b73c4611355f5f37077196ff2af454b1a4d06d97f49-merged.mount: Deactivated successfully.
Dec 06 09:39:12 compute-1 podman[73310]: 2025-12-06 09:39:12.850888853 +0000 UTC m=+31.342451573 container remove 81358b6ab17a0c17589dff90f711e3456d530a0f9c43daee99c79c2adfb69af6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=awesome_shaw, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_REF=squid, ceph=True, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325)
Dec 06 09:39:12 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 06 09:39:12 compute-1 systemd[1]: libpod-conmon-81358b6ab17a0c17589dff90f711e3456d530a0f9c43daee99c79c2adfb69af6.scope: Deactivated successfully.
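The short-lived awesome_shaw container is cephadm probing the image for the ceph user's numeric identity; its single line of output, `167 167`, is the uid/gid pair (167:167 is the ceph user and group in upstream Ceph images), and the same probe recurs before later deploys. The roughly 30 s gap before the create event is the initial image pull (the podman timestamps carry a +31 s monotonic offset). The conmon warning about memory.events is a known benign race when a container exits before conmon can read its cgroup. Lifecycle events like these can be replayed with:

    sudo podman events --since 10m --filter container=awesome_shaw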
Dec 06 09:39:13 compute-1 podman[73393]: 2025-12-06 09:39:13.039484058 +0000 UTC m=+0.031054122 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 06 09:39:13 compute-1 podman[73393]: 2025-12-06 09:39:13.135787751 +0000 UTC m=+0.127357815 container create 37a0c4bd0e1d05c63bbc777d99062257deeadec2a53e34a65fcf3b447971ad3c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=amazing_goldstine, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, CEPH_REF=squid, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS)
Dec 06 09:39:13 compute-1 systemd[1]: Started libpod-conmon-37a0c4bd0e1d05c63bbc777d99062257deeadec2a53e34a65fcf3b447971ad3c.scope.
Dec 06 09:39:13 compute-1 systemd[1]: Started libcrun container.
Dec 06 09:39:13 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2b62d2f8ad36f61b24d5e4fac46ad435c02ec7b7627f4c92d752d96dc35f9e04/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 06 09:39:13 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2b62d2f8ad36f61b24d5e4fac46ad435c02ec7b7627f4c92d752d96dc35f9e04/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 06 09:39:13 compute-1 podman[73393]: 2025-12-06 09:39:13.436139202 +0000 UTC m=+0.427709276 container init 37a0c4bd0e1d05c63bbc777d99062257deeadec2a53e34a65fcf3b447971ad3c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=amazing_goldstine, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, CEPH_REF=squid, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 06 09:39:13 compute-1 podman[73393]: 2025-12-06 09:39:13.448930913 +0000 UTC m=+0.440500947 container start 37a0c4bd0e1d05c63bbc777d99062257deeadec2a53e34a65fcf3b447971ad3c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=amazing_goldstine, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS)
Dec 06 09:39:13 compute-1 podman[73393]: 2025-12-06 09:39:13.490501067 +0000 UTC m=+0.482071141 container attach 37a0c4bd0e1d05c63bbc777d99062257deeadec2a53e34a65fcf3b447971ad3c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=amazing_goldstine, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0)
Dec 06 09:39:14 compute-1 amazing_goldstine[73408]: [
Dec 06 09:39:14 compute-1 amazing_goldstine[73408]:     {
Dec 06 09:39:14 compute-1 amazing_goldstine[73408]:         "available": false,
Dec 06 09:39:14 compute-1 amazing_goldstine[73408]:         "being_replaced": false,
Dec 06 09:39:14 compute-1 amazing_goldstine[73408]:         "ceph_device_lvm": false,
Dec 06 09:39:14 compute-1 amazing_goldstine[73408]:         "device_id": "QEMU_DVD-ROM_QM00001",
Dec 06 09:39:14 compute-1 amazing_goldstine[73408]:         "lsm_data": {},
Dec 06 09:39:14 compute-1 amazing_goldstine[73408]:         "lvs": [],
Dec 06 09:39:14 compute-1 amazing_goldstine[73408]:         "path": "/dev/sr0",
Dec 06 09:39:14 compute-1 amazing_goldstine[73408]:         "rejected_reasons": [
Dec 06 09:39:14 compute-1 amazing_goldstine[73408]:             "Has a FileSystem",
Dec 06 09:39:14 compute-1 amazing_goldstine[73408]:             "Insufficient space (<5GB)"
Dec 06 09:39:14 compute-1 amazing_goldstine[73408]:         ],
Dec 06 09:39:14 compute-1 amazing_goldstine[73408]:         "sys_api": {
Dec 06 09:39:14 compute-1 amazing_goldstine[73408]:             "actuators": null,
Dec 06 09:39:14 compute-1 amazing_goldstine[73408]:             "device_nodes": [
Dec 06 09:39:14 compute-1 amazing_goldstine[73408]:                 "sr0"
Dec 06 09:39:14 compute-1 amazing_goldstine[73408]:             ],
Dec 06 09:39:14 compute-1 amazing_goldstine[73408]:             "devname": "sr0",
Dec 06 09:39:14 compute-1 amazing_goldstine[73408]:             "human_readable_size": "482.00 KB",
Dec 06 09:39:14 compute-1 amazing_goldstine[73408]:             "id_bus": "ata",
Dec 06 09:39:14 compute-1 amazing_goldstine[73408]:             "model": "QEMU DVD-ROM",
Dec 06 09:39:14 compute-1 amazing_goldstine[73408]:             "nr_requests": "2",
Dec 06 09:39:14 compute-1 amazing_goldstine[73408]:             "parent": "/dev/sr0",
Dec 06 09:39:14 compute-1 amazing_goldstine[73408]:             "partitions": {},
Dec 06 09:39:14 compute-1 amazing_goldstine[73408]:             "path": "/dev/sr0",
Dec 06 09:39:14 compute-1 amazing_goldstine[73408]:             "removable": "1",
Dec 06 09:39:14 compute-1 amazing_goldstine[73408]:             "rev": "2.5+",
Dec 06 09:39:14 compute-1 amazing_goldstine[73408]:             "ro": "0",
Dec 06 09:39:14 compute-1 amazing_goldstine[73408]:             "rotational": "1",
Dec 06 09:39:14 compute-1 amazing_goldstine[73408]:             "sas_address": "",
Dec 06 09:39:14 compute-1 amazing_goldstine[73408]:             "sas_device_handle": "",
Dec 06 09:39:14 compute-1 amazing_goldstine[73408]:             "scheduler_mode": "mq-deadline",
Dec 06 09:39:14 compute-1 amazing_goldstine[73408]:             "sectors": 0,
Dec 06 09:39:14 compute-1 amazing_goldstine[73408]:             "sectorsize": "2048",
Dec 06 09:39:14 compute-1 amazing_goldstine[73408]:             "size": 493568.0,
Dec 06 09:39:14 compute-1 amazing_goldstine[73408]:             "support_discard": "2048",
Dec 06 09:39:14 compute-1 amazing_goldstine[73408]:             "type": "disk",
Dec 06 09:39:14 compute-1 amazing_goldstine[73408]:             "vendor": "QEMU"
Dec 06 09:39:14 compute-1 amazing_goldstine[73408]:         }
Dec 06 09:39:14 compute-1 amazing_goldstine[73408]:     }
Dec 06 09:39:14 compute-1 amazing_goldstine[73408]: ]
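This JSON is the `ceph-volume inventory --filter-for-batch` report: the only raw block device the container can see is the QEMU DVD-ROM, and it is marked "available": false with two rejection reasons (a filesystem is present and it is under 5 GB), so no raw disk will be claimed for OSDs on this host. A quick filter confirms the empty candidate set (illustrative):

    fsid=5ecd3f74-dade-5fc4-92ce-8950ae424258
    sudo python3 /var/lib/ceph/$fsid/cephadm.<digest> ceph-volume --fsid $fsid \
        -- inventory --format=json | jq -r '.[] | select(.available) | .path'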
Dec 06 09:39:14 compute-1 systemd[1]: libpod-37a0c4bd0e1d05c63bbc777d99062257deeadec2a53e34a65fcf3b447971ad3c.scope: Deactivated successfully.
Dec 06 09:39:14 compute-1 podman[73393]: 2025-12-06 09:39:14.347697036 +0000 UTC m=+1.339267070 container died 37a0c4bd0e1d05c63bbc777d99062257deeadec2a53e34a65fcf3b447971ad3c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=amazing_goldstine, CEPH_REF=squid, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 06 09:39:14 compute-1 systemd[1]: var-lib-containers-storage-overlay-2b62d2f8ad36f61b24d5e4fac46ad435c02ec7b7627f4c92d752d96dc35f9e04-merged.mount: Deactivated successfully.
Dec 06 09:39:14 compute-1 podman[73393]: 2025-12-06 09:39:14.405298133 +0000 UTC m=+1.396868167 container remove 37a0c4bd0e1d05c63bbc777d99062257deeadec2a53e34a65fcf3b447971ad3c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=amazing_goldstine, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_REF=squid, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2)
Dec 06 09:39:14 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 06 09:39:14 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 06 09:39:14 compute-1 systemd[1]: libpod-conmon-37a0c4bd0e1d05c63bbc777d99062257deeadec2a53e34a65fcf3b447971ad3c.scope: Deactivated successfully.
Dec 06 09:39:14 compute-1 sudo[73250]: pam_unix(sudo:session): session closed for user root
Dec 06 09:39:14 compute-1 sudo[74293]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 06 09:39:14 compute-1 sudo[74293]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:39:14 compute-1 sudo[74293]: pam_unix(sudo:session): session closed for user root
Dec 06 09:39:14 compute-1 sudo[74318]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/etc/ceph
Dec 06 09:39:14 compute-1 sudo[74318]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:39:14 compute-1 sudo[74318]: pam_unix(sudo:session): session closed for user root
Dec 06 09:39:14 compute-1 sudo[74343]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/etc/ceph/ceph.conf.new
Dec 06 09:39:14 compute-1 sudo[74343]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:39:14 compute-1 sudo[74343]: pam_unix(sudo:session): session closed for user root
Dec 06 09:39:14 compute-1 sudo[74368]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258
Dec 06 09:39:14 compute-1 sudo[74368]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:39:14 compute-1 sudo[74368]: pam_unix(sudo:session): session closed for user root
Dec 06 09:39:14 compute-1 sudo[74393]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/etc/ceph/ceph.conf.new
Dec 06 09:39:14 compute-1 sudo[74393]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:39:14 compute-1 sudo[74393]: pam_unix(sudo:session): session closed for user root
Dec 06 09:39:15 compute-1 sudo[74441]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/etc/ceph/ceph.conf.new
Dec 06 09:39:15 compute-1 sudo[74441]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:39:15 compute-1 sudo[74441]: pam_unix(sudo:session): session closed for user root
Dec 06 09:39:15 compute-1 sudo[74466]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/etc/ceph/ceph.conf.new
Dec 06 09:39:15 compute-1 sudo[74466]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:39:15 compute-1 sudo[74466]: pam_unix(sudo:session): session closed for user root
Dec 06 09:39:15 compute-1 sudo[74491]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Dec 06 09:39:15 compute-1 sudo[74491]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:39:15 compute-1 sudo[74491]: pam_unix(sudo:session): session closed for user root
Dec 06 09:39:15 compute-1 sudo[74516]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config
Dec 06 09:39:15 compute-1 sudo[74516]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:39:15 compute-1 sudo[74516]: pam_unix(sudo:session): session closed for user root
Dec 06 09:39:15 compute-1 sudo[74541]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config
Dec 06 09:39:15 compute-1 sudo[74541]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:39:15 compute-1 sudo[74541]: pam_unix(sudo:session): session closed for user root
Dec 06 09:39:15 compute-1 sudo[74566]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.conf.new
Dec 06 09:39:15 compute-1 sudo[74566]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:39:15 compute-1 sudo[74566]: pam_unix(sudo:session): session closed for user root
Dec 06 09:39:15 compute-1 sudo[74591]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258
Dec 06 09:39:15 compute-1 sudo[74591]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:39:15 compute-1 sudo[74591]: pam_unix(sudo:session): session closed for user root
Dec 06 09:39:15 compute-1 sudo[74616]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.conf.new
Dec 06 09:39:15 compute-1 sudo[74616]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:39:15 compute-1 sudo[74616]: pam_unix(sudo:session): session closed for user root
Dec 06 09:39:15 compute-1 sudo[74664]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.conf.new
Dec 06 09:39:15 compute-1 sudo[74664]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:39:15 compute-1 sudo[74664]: pam_unix(sudo:session): session closed for user root
Dec 06 09:39:15 compute-1 sudo[74689]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.conf.new
Dec 06 09:39:15 compute-1 sudo[74689]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:39:15 compute-1 sudo[74689]: pam_unix(sudo:session): session closed for user root
Dec 06 09:39:15 compute-1 sudo[74714]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.conf.new /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.conf
Dec 06 09:39:15 compute-1 sudo[74714]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:39:15 compute-1 sudo[74714]: pam_unix(sudo:session): session closed for user root
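The staging pattern now distributes the cluster config twice: to /etc/ceph/ceph.conf for ordinary clients on the host, and to /var/lib/ceph/<fsid>/config/ceph.conf for cephadm's own use, both ending up mode 644 and root-owned. The file contents are not logged; a cephadm-managed minimal config usually carries little more than the cluster identity, roughly (fsid from this log, the rest illustrative):

    cat /etc/ceph/ceph.conf
    # [global]
    #         fsid = 5ecd3f74-dade-5fc4-92ce-8950ae424258
    #         mon_host = <mon address list>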
Dec 06 09:39:15 compute-1 sudo[74739]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 06 09:39:15 compute-1 sudo[74739]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:39:15 compute-1 sudo[74739]: pam_unix(sudo:session): session closed for user root
Dec 06 09:39:16 compute-1 sudo[74764]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/etc/ceph
Dec 06 09:39:16 compute-1 sudo[74764]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:39:16 compute-1 sudo[74764]: pam_unix(sudo:session): session closed for user root
Dec 06 09:39:16 compute-1 sudo[74789]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/etc/ceph/ceph.client.admin.keyring.new
Dec 06 09:39:16 compute-1 sudo[74789]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:39:16 compute-1 sudo[74789]: pam_unix(sudo:session): session closed for user root
Dec 06 09:39:16 compute-1 sudo[74814]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258
Dec 06 09:39:16 compute-1 sudo[74814]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:39:16 compute-1 sudo[74814]: pam_unix(sudo:session): session closed for user root
Dec 06 09:39:16 compute-1 sudo[74839]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/etc/ceph/ceph.client.admin.keyring.new
Dec 06 09:39:16 compute-1 sudo[74839]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:39:16 compute-1 sudo[74839]: pam_unix(sudo:session): session closed for user root
Dec 06 09:39:16 compute-1 sudo[74887]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/etc/ceph/ceph.client.admin.keyring.new
Dec 06 09:39:16 compute-1 sudo[74887]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:39:16 compute-1 sudo[74887]: pam_unix(sudo:session): session closed for user root
Dec 06 09:39:16 compute-1 sudo[74912]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/etc/ceph/ceph.client.admin.keyring.new
Dec 06 09:39:16 compute-1 sudo[74912]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:39:16 compute-1 sudo[74912]: pam_unix(sudo:session): session closed for user root
Dec 06 09:39:16 compute-1 sudo[74937]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Dec 06 09:39:16 compute-1 sudo[74937]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:39:16 compute-1 sudo[74937]: pam_unix(sudo:session): session closed for user root
Dec 06 09:39:16 compute-1 sudo[74962]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config
Dec 06 09:39:16 compute-1 sudo[74962]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:39:16 compute-1 sudo[74962]: pam_unix(sudo:session): session closed for user root
Dec 06 09:39:16 compute-1 sudo[74987]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config
Dec 06 09:39:16 compute-1 sudo[74987]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:39:16 compute-1 sudo[74987]: pam_unix(sudo:session): session closed for user root
Dec 06 09:39:16 compute-1 sudo[75012]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.client.admin.keyring.new
Dec 06 09:39:16 compute-1 sudo[75012]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:39:16 compute-1 sudo[75012]: pam_unix(sudo:session): session closed for user root
Dec 06 09:39:16 compute-1 sudo[75037]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258
Dec 06 09:39:16 compute-1 sudo[75037]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:39:16 compute-1 sudo[75037]: pam_unix(sudo:session): session closed for user root
Dec 06 09:39:16 compute-1 sudo[75062]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.client.admin.keyring.new
Dec 06 09:39:16 compute-1 sudo[75062]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:39:16 compute-1 sudo[75062]: pam_unix(sudo:session): session closed for user root
Dec 06 09:39:17 compute-1 sudo[75110]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.client.admin.keyring.new
Dec 06 09:39:17 compute-1 sudo[75110]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:39:17 compute-1 sudo[75110]: pam_unix(sudo:session): session closed for user root
Dec 06 09:39:17 compute-1 sudo[75135]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.client.admin.keyring.new
Dec 06 09:39:17 compute-1 sudo[75135]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:39:17 compute-1 sudo[75135]: pam_unix(sudo:session): session closed for user root
Dec 06 09:39:17 compute-1 sudo[75160]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.client.admin.keyring.new /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.client.admin.keyring
Dec 06 09:39:17 compute-1 sudo[75160]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:39:17 compute-1 sudo[75160]: pam_unix(sudo:session): session closed for user root
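The admin keyring follows the same two destinations, with one deliberate difference from the config files: after staging at 644 the tree is handed back to root (chown -R 0:0) and the keyring is tightened to 600 before the move, since ceph.client.admin.keyring carries the cluster's admin secret. The result is easy to verify:

    stat -c '%a %U:%G %n' /etc/ceph/ceph.client.admin.keyring   # expect 600 root:root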
Dec 06 09:39:17 compute-1 sudo[75185]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:39:17 compute-1 sudo[75185]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:39:17 compute-1 sudo[75185]: pam_unix(sudo:session): session closed for user root
Dec 06 09:39:17 compute-1 sudo[75210]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 _orch deploy --fsid 5ecd3f74-dade-5fc4-92ce-8950ae424258
Dec 06 09:39:17 compute-1 sudo[75210]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:39:17 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 06 09:39:17 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 06 09:39:17 compute-1 podman[75275]: 2025-12-06 09:39:17.83779864 +0000 UTC m=+0.071745646 container create ae2f9b8500defe93bab38ac395c58893e56aeb86aecf732da6628f4a5337c11d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=great_jones, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 06 09:39:17 compute-1 systemd[1]: Started libpod-conmon-ae2f9b8500defe93bab38ac395c58893e56aeb86aecf732da6628f4a5337c11d.scope.
Dec 06 09:39:17 compute-1 systemd[1]: Started libcrun container.
Dec 06 09:39:17 compute-1 podman[75275]: 2025-12-06 09:39:17.809524735 +0000 UTC m=+0.043471801 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 06 09:39:17 compute-1 podman[75275]: 2025-12-06 09:39:17.920087089 +0000 UTC m=+0.154034065 container init ae2f9b8500defe93bab38ac395c58893e56aeb86aecf732da6628f4a5337c11d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=great_jones, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2)
Dec 06 09:39:17 compute-1 podman[75275]: 2025-12-06 09:39:17.928003432 +0000 UTC m=+0.161950398 container start ae2f9b8500defe93bab38ac395c58893e56aeb86aecf732da6628f4a5337c11d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=great_jones, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, CEPH_REF=squid, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 06 09:39:17 compute-1 podman[75275]: 2025-12-06 09:39:17.932911641 +0000 UTC m=+0.166858667 container attach ae2f9b8500defe93bab38ac395c58893e56aeb86aecf732da6628f4a5337c11d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=great_jones, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default)
Dec 06 09:39:17 compute-1 great_jones[75292]: 167 167
Dec 06 09:39:17 compute-1 systemd[1]: libpod-ae2f9b8500defe93bab38ac395c58893e56aeb86aecf732da6628f4a5337c11d.scope: Deactivated successfully.
Dec 06 09:39:17 compute-1 podman[75275]: 2025-12-06 09:39:17.935407048 +0000 UTC m=+0.169354014 container died ae2f9b8500defe93bab38ac395c58893e56aeb86aecf732da6628f4a5337c11d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=great_jones, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325)
Dec 06 09:39:17 compute-1 podman[75275]: 2025-12-06 09:39:17.977327514 +0000 UTC m=+0.211274490 container remove ae2f9b8500defe93bab38ac395c58893e56aeb86aecf732da6628f4a5337c11d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=great_jones, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, ceph=True)
Dec 06 09:39:17 compute-1 systemd[1]: libpod-conmon-ae2f9b8500defe93bab38ac395c58893e56aeb86aecf732da6628f4a5337c11d.scope: Deactivated successfully.
Dec 06 09:39:18 compute-1 systemd[1]: Reloading.
Dec 06 09:39:18 compute-1 systemd-sysv-generator[75338]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:39:18 compute-1 systemd-rc-local-generator[75335]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:39:18 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 06 09:39:18 compute-1 systemd[1]: Reloading.
Dec 06 09:39:18 compute-1 systemd-rc-local-generator[75376]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:39:18 compute-1 systemd-sysv-generator[75380]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:39:18 compute-1 systemd[1]: Reached target All Ceph clusters and services.
Dec 06 09:39:18 compute-1 systemd[1]: Reloading.
Dec 06 09:39:18 compute-1 systemd-sysv-generator[75417]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:39:18 compute-1 systemd-rc-local-generator[75413]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:39:18 compute-1 systemd[1]: Reached target Ceph cluster 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec 06 09:39:18 compute-1 systemd[1]: Reloading.
Dec 06 09:39:19 compute-1 systemd-rc-local-generator[75453]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:39:19 compute-1 systemd-sysv-generator[75457]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:39:19 compute-1 systemd[1]: Reloading.
Dec 06 09:39:19 compute-1 systemd-rc-local-generator[75495]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:39:19 compute-1 systemd-sysv-generator[75498]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:39:19 compute-1 systemd[1]: Created slice Slice /system/ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec 06 09:39:19 compute-1 systemd[1]: Reached target System Time Set.
Dec 06 09:39:19 compute-1 systemd[1]: Reached target System Time Synchronized.
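Everything from the `_orch deploy` call down to the service start is cephadm deploying the crash collector (the short great_jones container was another uid/gid probe ahead of it). Each `Reloading.` block is a `systemctl daemon-reload` issued as unit files, slices and targets are written; the repeated sysv-generator and rc-local messages are stock generator noise re-emitted on every reload, not new problems. The targets reached form cephadm's usual hierarchy: the global ceph.target ("All Ceph clusters and services") above the per-cluster ceph-<fsid>.target. The tree can be inspected with:

    systemctl list-dependencies ceph.target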
Dec 06 09:39:19 compute-1 systemd[1]: Starting Ceph crash.compute-1 for 5ecd3f74-dade-5fc4-92ce-8950ae424258...
Dec 06 09:39:19 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 06 09:39:19 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 06 09:39:19 compute-1 podman[75552]: 2025-12-06 09:39:19.857758551 +0000 UTC m=+0.055401112 container create 500f8c89b5c281d0ace139835b07ea5bc6259ce3e028d8284d57da97424b55e0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-crash-compute-1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, ceph=True, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS)
Dec 06 09:39:19 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e55095be0e98c93c69902ee386c80c8a804e8c3822ff70ec6394b1e9140badc8/merged/etc/ceph/ceph.client.crash.compute-1.keyring supports timestamps until 2038 (0x7fffffff)
Dec 06 09:39:19 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e55095be0e98c93c69902ee386c80c8a804e8c3822ff70ec6394b1e9140badc8/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 06 09:39:19 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e55095be0e98c93c69902ee386c80c8a804e8c3822ff70ec6394b1e9140badc8/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 06 09:39:19 compute-1 podman[75552]: 2025-12-06 09:39:19.930565593 +0000 UTC m=+0.128208174 container init 500f8c89b5c281d0ace139835b07ea5bc6259ce3e028d8284d57da97424b55e0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-crash-compute-1, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec 06 09:39:19 compute-1 podman[75552]: 2025-12-06 09:39:19.837746831 +0000 UTC m=+0.035389422 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 06 09:39:19 compute-1 podman[75552]: 2025-12-06 09:39:19.941041364 +0000 UTC m=+0.138683925 container start 500f8c89b5c281d0ace139835b07ea5bc6259ce3e028d8284d57da97424b55e0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-crash-compute-1, OSD_FLAVOR=default, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS)
Dec 06 09:39:19 compute-1 bash[75552]: 500f8c89b5c281d0ace139835b07ea5bc6259ce3e028d8284d57da97424b55e0
Dec 06 09:39:19 compute-1 systemd[1]: Started Ceph crash.compute-1 for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
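The crash collector runs under a cephadm-generated unit whose name follows the ceph-<fsid>@<daemon>.<id> pattern; the name below is reconstructed from that pattern rather than quoted from the log:

    systemctl status 'ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@crash.compute-1.service'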
Dec 06 09:39:20 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-crash-compute-1[75567]: INFO:ceph-crash:pinging cluster to exercise our key
Dec 06 09:39:20 compute-1 sudo[75210]: pam_unix(sudo:session): session closed for user root
Dec 06 09:39:20 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-crash-compute-1[75567]: 2025-12-06T09:39:20.101+0000 7f5bc2f50640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Dec 06 09:39:20 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-crash-compute-1[75567]: 2025-12-06T09:39:20.101+0000 7f5bc2f50640 -1 AuthRegistry(0x7f5bbc0698f0) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Dec 06 09:39:20 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-crash-compute-1[75567]: 2025-12-06T09:39:20.102+0000 7f5bc2f50640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Dec 06 09:39:20 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-crash-compute-1[75567]: 2025-12-06T09:39:20.102+0000 7f5bc2f50640 -1 AuthRegistry(0x7f5bc2f4eff0) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Dec 06 09:39:20 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-crash-compute-1[75567]: 2025-12-06T09:39:20.105+0000 7f5bc0cc5640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Dec 06 09:39:20 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-crash-compute-1[75567]: 2025-12-06T09:39:20.106+0000 7f5bc2f50640 -1 monclient: authenticate NOTE: no keyring found; disabled cephx authentication
Dec 06 09:39:20 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-crash-compute-1[75567]: [errno 13] RADOS permission denied (error connecting to the cluster)
Dec 06 09:39:20 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-crash-compute-1[75567]: INFO:ceph-crash:monitoring path /var/lib/ceph/crash, delay 600s
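The burst of auth errors is the collector's start-up ping rather than a deployment failure: the probe here appears to search the default (admin) keyring paths, finds nothing, disables cephx and is refused with EACCES, after which the daemon settles into watching /var/lib/ceph/crash on a 600 s cycle as logged. The container does carry its own key, bind-mounted at /etc/ceph/ceph.client.crash.compute-1.keyring (visible in the xfs remount lines at start-up); if confirmation is wanted, the dedicated identity can be exercised directly (illustrative diagnostic, not from the log):

    sudo podman exec ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-crash-compute-1 \
        ceph -n client.crash.compute-1 -k /etc/ceph/ceph.client.crash.compute-1.keyring -s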
Dec 06 09:39:20 compute-1 sudo[75574]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:39:20 compute-1 sudo[75574]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:39:20 compute-1 sudo[75574]: pam_unix(sudo:session): session closed for user root
Dec 06 09:39:20 compute-1 sudo[75609]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ceph-volume --fsid 5ecd3f74-dade-5fc4-92ce-8950ae424258 --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 --yes --no-systemd
Dec 06 09:39:20 compute-1 sudo[75609]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
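OSD creation targets a pre-provisioned logical volume, /dev/ceph_vg0/ceph_lv0, not a raw disk, which is consistent with the inventory above rejecting the only raw device. `--config-json -` feeds configuration and keyring material over stdin, CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group tags the resulting OSDs with the originating service-spec name, and --no-systemd leaves unit management to cephadm itself. Stripped of the stdin payload, the call reduces to (placeholders stand in for the long values in the log):

    fsid=5ecd3f74-dade-5fc4-92ce-8950ae424258
    sudo python3 /var/lib/ceph/$fsid/cephadm.<digest> \
        --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group \
        --image <image-digest> --timeout 895 \
        ceph-volume --fsid $fsid -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 --yes --no-systemd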
Dec 06 09:39:20 compute-1 podman[75674]: 2025-12-06 09:39:20.679575681 +0000 UTC m=+0.068553596 container create 67bdb980cdfa59ce40251524f957bace2877e08612d28f8836013f3113d57aa3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=relaxed_borg, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid)
Dec 06 09:39:20 compute-1 systemd[1]: Started libpod-conmon-67bdb980cdfa59ce40251524f957bace2877e08612d28f8836013f3113d57aa3.scope.
Dec 06 09:39:20 compute-1 podman[75674]: 2025-12-06 09:39:20.656986782 +0000 UTC m=+0.045964727 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 06 09:39:20 compute-1 systemd[1]: Started libcrun container.
Dec 06 09:39:20 compute-1 podman[75674]: 2025-12-06 09:39:20.796891878 +0000 UTC m=+0.185869833 container init 67bdb980cdfa59ce40251524f957bace2877e08612d28f8836013f3113d57aa3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=relaxed_borg, CEPH_REF=squid, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec 06 09:39:20 compute-1 podman[75674]: 2025-12-06 09:39:20.808750197 +0000 UTC m=+0.197728152 container start 67bdb980cdfa59ce40251524f957bace2877e08612d28f8836013f3113d57aa3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=relaxed_borg, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True)
Dec 06 09:39:20 compute-1 podman[75674]: 2025-12-06 09:39:20.813198611 +0000 UTC m=+0.202176536 container attach 67bdb980cdfa59ce40251524f957bace2877e08612d28f8836013f3113d57aa3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=relaxed_borg, ceph=True, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0)
Dec 06 09:39:20 compute-1 relaxed_borg[75690]: 167 167
Dec 06 09:39:20 compute-1 systemd[1]: libpod-67bdb980cdfa59ce40251524f957bace2877e08612d28f8836013f3113d57aa3.scope: Deactivated successfully.
Dec 06 09:39:20 compute-1 conmon[75690]: conmon 67bdb980cdfa59ce4025 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-67bdb980cdfa59ce40251524f957bace2877e08612d28f8836013f3113d57aa3.scope/container/memory.events
Dec 06 09:39:20 compute-1 podman[75674]: 2025-12-06 09:39:20.818443761 +0000 UTC m=+0.207421686 container died 67bdb980cdfa59ce40251524f957bace2877e08612d28f8836013f3113d57aa3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=relaxed_borg, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_REF=squid, OSD_FLAVOR=default)
Dec 06 09:39:20 compute-1 systemd[1]: var-lib-containers-storage-overlay-20b1b99cb844c6e5bc7706cb2e0c22d9e4165de86258b013d82eda69c16555fb-merged.mount: Deactivated successfully.
Dec 06 09:39:20 compute-1 podman[75674]: 2025-12-06 09:39:20.87347854 +0000 UTC m=+0.262456455 container remove 67bdb980cdfa59ce40251524f957bace2877e08612d28f8836013f3113d57aa3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=relaxed_borg, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, ceph=True, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS)
Dec 06 09:39:20 compute-1 systemd[1]: libpod-conmon-67bdb980cdfa59ce40251524f957bace2877e08612d28f8836013f3113d57aa3.scope: Deactivated successfully.
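The short-lived relaxed_borg container exists only to print "167 167": the uid and gid of the ceph user baked into the image (167 is the fixed ceph uid/gid in Ceph packaging). cephadm appears to look these up with a throwaway container before running ceph-volume so that host-side chowns match the in-container user. A sketch of the same probe; stat'ing /var/lib/ceph inside the image is an assumption about what the container runs, but it yields the same pair:

    # Sketch: recover the ceph uid/gid ("167 167") from the OSD image with a
    # throwaway container, similar to the relaxed_borg run above.
    import subprocess

    IMAGE = ("quay.io/ceph/ceph@sha256:"
             "7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec")

    out = subprocess.run(
        ["podman", "run", "--rm", "--entrypoint", "stat", IMAGE,
         "-c", "%u %g", "/var/lib/ceph"],
        capture_output=True, text=True, check=True,
    ).stdout.split()
    uid, gid = int(out[0]), int(out[1])
    print(uid, gid)  # expected: 167 167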
Dec 06 09:39:21 compute-1 podman[75715]: 2025-12-06 09:39:21.06625668 +0000 UTC m=+0.056768970 container create 2d59b1abb6cfc7b59f0ce21f44bba1626d59a9b342d84f2aeea702cab76cab92 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=festive_clarke, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 06 09:39:21 compute-1 systemd[1]: Started libpod-conmon-2d59b1abb6cfc7b59f0ce21f44bba1626d59a9b342d84f2aeea702cab76cab92.scope.
Dec 06 09:39:21 compute-1 podman[75715]: 2025-12-06 09:39:21.037759787 +0000 UTC m=+0.028272157 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 06 09:39:21 compute-1 systemd[1]: Started libcrun container.
Dec 06 09:39:21 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/837768d19173e8ce29e6225388bd01c3ba0dad9c01ad2b562e610b499719db21/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 06 09:39:21 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/837768d19173e8ce29e6225388bd01c3ba0dad9c01ad2b562e610b499719db21/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 06 09:39:21 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/837768d19173e8ce29e6225388bd01c3ba0dad9c01ad2b562e610b499719db21/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 06 09:39:21 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/837768d19173e8ce29e6225388bd01c3ba0dad9c01ad2b562e610b499719db21/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 06 09:39:21 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/837768d19173e8ce29e6225388bd01c3ba0dad9c01ad2b562e610b499719db21/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
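The repeated xfs notices are informational: each file or directory bind-mounted into the container sits on an xfs filesystem formatted without the bigtime feature, so inode timestamps saturate at 0x7fffffff seconds after the epoch. Converting that cap to a date shows why the kernel flags it:

    # The kernel prints the cap as 0x7fffffff (2**31 - 1 seconds). Converting:
    from datetime import datetime, timezone

    cap = 0x7FFFFFFF
    print(datetime.fromtimestamp(cap, tz=timezone.utc))
    # 2038-01-19 03:14:07+00:00 -- the classic y2038 limit.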
Dec 06 09:39:21 compute-1 podman[75715]: 2025-12-06 09:39:21.163186843 +0000 UTC m=+0.153699233 container init 2d59b1abb6cfc7b59f0ce21f44bba1626d59a9b342d84f2aeea702cab76cab92 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=festive_clarke, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, OSD_FLAVOR=default)
Dec 06 09:39:21 compute-1 podman[75715]: 2025-12-06 09:39:21.173565541 +0000 UTC m=+0.164077831 container start 2d59b1abb6cfc7b59f0ce21f44bba1626d59a9b342d84f2aeea702cab76cab92 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=festive_clarke, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, ceph=True, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 06 09:39:21 compute-1 podman[75715]: 2025-12-06 09:39:21.178343256 +0000 UTC m=+0.168855606 container attach 2d59b1abb6cfc7b59f0ce21f44bba1626d59a9b342d84f2aeea702cab76cab92 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=festive_clarke, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, ceph=True, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec 06 09:39:21 compute-1 festive_clarke[75731]: --> passed data devices: 0 physical, 1 LVM
Dec 06 09:39:21 compute-1 festive_clarke[75731]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec 06 09:39:21 compute-1 festive_clarke[75731]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec 06 09:39:21 compute-1 festive_clarke[75731]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring -i - osd new a01bc6a6-e368-4763-a10f-41794e4ef717
Dec 06 09:39:22 compute-1 festive_clarke[75731]: Running command: /usr/bin/mount -t tmpfs tmpfs /var/lib/ceph/osd/ceph-0
Dec 06 09:39:22 compute-1 festive_clarke[75731]: Running command: /usr/bin/chown -h ceph:ceph /dev/ceph_vg0/ceph_lv0
Dec 06 09:39:22 compute-1 festive_clarke[75731]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Dec 06 09:39:22 compute-1 festive_clarke[75731]: Running command: /usr/bin/ln -s /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-0/block
Dec 06 09:39:22 compute-1 festive_clarke[75731]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring mon getmap -o /var/lib/ceph/osd/ceph-0/activate.monmap
Dec 06 09:39:22 compute-1 lvm[75794]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 06 09:39:22 compute-1 lvm[75794]: VG ceph_vg0 finished
Dec 06 09:39:22 compute-1 festive_clarke[75731]:  stderr: got monmap epoch 1
Dec 06 09:39:22 compute-1 festive_clarke[75731]: --> Creating keyring file for osd.0
Dec 06 09:39:22 compute-1 festive_clarke[75731]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0/keyring
Dec 06 09:39:22 compute-1 festive_clarke[75731]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0/
Dec 06 09:39:22 compute-1 festive_clarke[75731]: Running command: /usr/bin/ceph-osd --cluster ceph --osd-objectstore bluestore --mkfs -i 0 --monmap /var/lib/ceph/osd/ceph-0/activate.monmap --keyfile - --osdspec-affinity default_drive_group --osd-data /var/lib/ceph/osd/ceph-0/ --osd-uuid a01bc6a6-e368-4763-a10f-41794e4ef717 --setuser ceph --setgroup ceph
Dec 06 09:39:25 compute-1 festive_clarke[75731]:  stderr: 2025-12-06T09:39:22.973+0000 7f63b388f740 -1 bluestore(/var/lib/ceph/osd/ceph-0//block) No valid bdev label found
Dec 06 09:39:25 compute-1 festive_clarke[75731]:  stderr: 2025-12-06T09:39:23.242+0000 7f63b388f740 -1 bluestore(/var/lib/ceph/osd/ceph-0/) _read_fsid unparsable uuid
Dec 06 09:39:25 compute-1 festive_clarke[75731]: --> ceph-volume lvm prepare successful for: ceph_vg0/ceph_lv0
Dec 06 09:39:26 compute-1 festive_clarke[75731]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Dec 06 09:39:26 compute-1 festive_clarke[75731]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-0 --no-mon-config
Dec 06 09:39:26 compute-1 festive_clarke[75731]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-0/block
Dec 06 09:39:26 compute-1 festive_clarke[75731]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-0/block
Dec 06 09:39:26 compute-1 festive_clarke[75731]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Dec 06 09:39:26 compute-1 festive_clarke[75731]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Dec 06 09:39:26 compute-1 festive_clarke[75731]: --> ceph-volume lvm activate successful for osd ID: 0
Dec 06 09:39:26 compute-1 festive_clarke[75731]: --> ceph-volume lvm create successful for: ceph_vg0/ceph_lv0
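The festive_clarke run is one "ceph-volume lvm create" pass: prepare (osd new, tmpfs data dir, block symlink, monmap fetch, ceph-osd --mkfs) followed by activate. The two stderr lines mid-sequence, "No valid bdev label found" and "_read_fsid unparsable uuid", are expected on a brand-new LV: mkfs reads the device before writing the bluestore label, finds nothing, then writes it. A quick after-the-fact check that the label landed, using ceph-bluestore-tool show-label (which only reads the device header):

    # Verify the bluestore label that mkfs wrote to the new LV. The JSON is
    # keyed by device path and carries the osd_uuid seen in the journal.
    import json
    import subprocess

    DEV = "/dev/ceph_vg0/ceph_lv0"
    out = subprocess.run(
        ["ceph-bluestore-tool", "show-label", "--dev", DEV],
        capture_output=True, text=True, check=True,
    ).stdout
    label = json.loads(out)[DEV]
    assert label["osd_uuid"] == "a01bc6a6-e368-4763-a10f-41794e4ef717"
    print(label["whoami"], label["osd_uuid"])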
Dec 06 09:39:26 compute-1 systemd[1]: libpod-2d59b1abb6cfc7b59f0ce21f44bba1626d59a9b342d84f2aeea702cab76cab92.scope: Deactivated successfully.
Dec 06 09:39:26 compute-1 systemd[1]: libpod-2d59b1abb6cfc7b59f0ce21f44bba1626d59a9b342d84f2aeea702cab76cab92.scope: Consumed 2.267s CPU time.
Dec 06 09:39:26 compute-1 podman[76696]: 2025-12-06 09:39:26.483626628 +0000 UTC m=+0.049970705 container died 2d59b1abb6cfc7b59f0ce21f44bba1626d59a9b342d84f2aeea702cab76cab92 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=festive_clarke, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 06 09:39:26 compute-1 systemd[1]: var-lib-containers-storage-overlay-837768d19173e8ce29e6225388bd01c3ba0dad9c01ad2b562e610b499719db21-merged.mount: Deactivated successfully.
Dec 06 09:39:26 compute-1 podman[76696]: 2025-12-06 09:39:26.527655266 +0000 UTC m=+0.093999313 container remove 2d59b1abb6cfc7b59f0ce21f44bba1626d59a9b342d84f2aeea702cab76cab92 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=festive_clarke, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325)
Dec 06 09:39:26 compute-1 systemd[1]: libpod-conmon-2d59b1abb6cfc7b59f0ce21f44bba1626d59a9b342d84f2aeea702cab76cab92.scope: Deactivated successfully.
Dec 06 09:39:26 compute-1 sudo[75609]: pam_unix(sudo:session): session closed for user root
Dec 06 09:39:26 compute-1 sudo[76711]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:39:26 compute-1 sudo[76711]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:39:26 compute-1 sudo[76711]: pam_unix(sudo:session): session closed for user root
Dec 06 09:39:26 compute-1 sudo[76736]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ceph-volume --fsid 5ecd3f74-dade-5fc4-92ce-8950ae424258 -- lvm list --format json
Dec 06 09:39:26 compute-1 sudo[76736]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:39:27 compute-1 podman[76800]: 2025-12-06 09:39:27.179532324 +0000 UTC m=+0.052658968 container create 6fe09e9dc68e97efb9be36e4093497816872a51599b7f384c45bc6215455cb16 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=fervent_shockley, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, io.buildah.version=1.40.1)
Dec 06 09:39:27 compute-1 systemd[1]: Started libpod-conmon-6fe09e9dc68e97efb9be36e4093497816872a51599b7f384c45bc6215455cb16.scope.
Dec 06 09:39:27 compute-1 podman[76800]: 2025-12-06 09:39:27.159218663 +0000 UTC m=+0.032345297 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 06 09:39:27 compute-1 systemd[1]: Started libcrun container.
Dec 06 09:39:27 compute-1 podman[76800]: 2025-12-06 09:39:27.293847397 +0000 UTC m=+0.166974041 container init 6fe09e9dc68e97efb9be36e4093497816872a51599b7f384c45bc6215455cb16 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=fervent_shockley, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 06 09:39:27 compute-1 podman[76800]: 2025-12-06 09:39:27.303874643 +0000 UTC m=+0.177001287 container start 6fe09e9dc68e97efb9be36e4093497816872a51599b7f384c45bc6215455cb16 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=fervent_shockley, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, CEPH_REF=squid, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2)
Dec 06 09:39:27 compute-1 podman[76800]: 2025-12-06 09:39:27.308511063 +0000 UTC m=+0.181637707 container attach 6fe09e9dc68e97efb9be36e4093497816872a51599b7f384c45bc6215455cb16 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=fervent_shockley, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 06 09:39:27 compute-1 fervent_shockley[76816]: 167 167
Dec 06 09:39:27 compute-1 systemd[1]: libpod-6fe09e9dc68e97efb9be36e4093497816872a51599b7f384c45bc6215455cb16.scope: Deactivated successfully.
Dec 06 09:39:27 compute-1 podman[76800]: 2025-12-06 09:39:27.312430738 +0000 UTC m=+0.185557382 container died 6fe09e9dc68e97efb9be36e4093497816872a51599b7f384c45bc6215455cb16 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=fervent_shockley, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0)
Dec 06 09:39:27 compute-1 systemd[1]: var-lib-containers-storage-overlay-37eb6bb8d3725a7677d32a1e7a38351260f1bc45e905ced346ddf387b6b8d75a-merged.mount: Deactivated successfully.
Dec 06 09:39:27 compute-1 podman[76800]: 2025-12-06 09:39:27.36465187 +0000 UTC m=+0.237778474 container remove 6fe09e9dc68e97efb9be36e4093497816872a51599b7f384c45bc6215455cb16 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=fervent_shockley, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, ceph=True, io.buildah.version=1.40.1, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 06 09:39:27 compute-1 systemd[1]: libpod-conmon-6fe09e9dc68e97efb9be36e4093497816872a51599b7f384c45bc6215455cb16.scope: Deactivated successfully.
Dec 06 09:39:27 compute-1 podman[76839]: 2025-12-06 09:39:27.568817973 +0000 UTC m=+0.060288971 container create 2801b7c0390a8d7af79d9e28e1e2c2d9cf7f13678710e708f19be67b5d219238 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=objective_noyce, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, io.buildah.version=1.40.1)
Dec 06 09:39:27 compute-1 systemd[1]: Started libpod-conmon-2801b7c0390a8d7af79d9e28e1e2c2d9cf7f13678710e708f19be67b5d219238.scope.
Dec 06 09:39:27 compute-1 podman[76839]: 2025-12-06 09:39:27.540086321 +0000 UTC m=+0.031557349 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 06 09:39:27 compute-1 systemd[1]: Started libcrun container.
Dec 06 09:39:27 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d8f587168e5728a14981cb5200db2fa6d4c36db187778d06fea6675aa383effb/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 06 09:39:27 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d8f587168e5728a14981cb5200db2fa6d4c36db187778d06fea6675aa383effb/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 06 09:39:27 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d8f587168e5728a14981cb5200db2fa6d4c36db187778d06fea6675aa383effb/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 06 09:39:27 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d8f587168e5728a14981cb5200db2fa6d4c36db187778d06fea6675aa383effb/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 06 09:39:27 compute-1 podman[76839]: 2025-12-06 09:39:27.677613926 +0000 UTC m=+0.169084904 container init 2801b7c0390a8d7af79d9e28e1e2c2d9cf7f13678710e708f19be67b5d219238 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=objective_noyce, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 06 09:39:27 compute-1 podman[76839]: 2025-12-06 09:39:27.688674247 +0000 UTC m=+0.180145205 container start 2801b7c0390a8d7af79d9e28e1e2c2d9cf7f13678710e708f19be67b5d219238 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=objective_noyce, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, ceph=True, CEPH_REF=squid)
Dec 06 09:39:27 compute-1 podman[76839]: 2025-12-06 09:39:27.692564831 +0000 UTC m=+0.184035789 container attach 2801b7c0390a8d7af79d9e28e1e2c2d9cf7f13678710e708f19be67b5d219238 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=objective_noyce, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default)
Dec 06 09:39:27 compute-1 objective_noyce[76856]: {
Dec 06 09:39:27 compute-1 objective_noyce[76856]:     "0": [
Dec 06 09:39:28 compute-1 objective_noyce[76856]:         {
Dec 06 09:39:28 compute-1 objective_noyce[76856]:             "devices": [
Dec 06 09:39:28 compute-1 objective_noyce[76856]:                 "/dev/loop3"
Dec 06 09:39:28 compute-1 objective_noyce[76856]:             ],
Dec 06 09:39:28 compute-1 objective_noyce[76856]:             "lv_name": "ceph_lv0",
Dec 06 09:39:28 compute-1 objective_noyce[76856]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec 06 09:39:28 compute-1 objective_noyce[76856]:             "lv_size": "21470642176",
Dec 06 09:39:28 compute-1 objective_noyce[76856]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=vDR22O-WywQ-swhh-zHBg-feef-qNj1-Dqh00z,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=5ecd3f74-dade-5fc4-92ce-8950ae424258,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=a01bc6a6-e368-4763-a10f-41794e4ef717,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Dec 06 09:39:28 compute-1 objective_noyce[76856]:             "lv_uuid": "vDR22O-WywQ-swhh-zHBg-feef-qNj1-Dqh00z",
Dec 06 09:39:28 compute-1 objective_noyce[76856]:             "name": "ceph_lv0",
Dec 06 09:39:28 compute-1 objective_noyce[76856]:             "path": "/dev/ceph_vg0/ceph_lv0",
Dec 06 09:39:28 compute-1 objective_noyce[76856]:             "tags": {
Dec 06 09:39:28 compute-1 objective_noyce[76856]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec 06 09:39:28 compute-1 objective_noyce[76856]:                 "ceph.block_uuid": "vDR22O-WywQ-swhh-zHBg-feef-qNj1-Dqh00z",
Dec 06 09:39:28 compute-1 objective_noyce[76856]:                 "ceph.cephx_lockbox_secret": "",
Dec 06 09:39:28 compute-1 objective_noyce[76856]:                 "ceph.cluster_fsid": "5ecd3f74-dade-5fc4-92ce-8950ae424258",
Dec 06 09:39:28 compute-1 objective_noyce[76856]:                 "ceph.cluster_name": "ceph",
Dec 06 09:39:28 compute-1 objective_noyce[76856]:                 "ceph.crush_device_class": "",
Dec 06 09:39:28 compute-1 objective_noyce[76856]:                 "ceph.encrypted": "0",
Dec 06 09:39:28 compute-1 objective_noyce[76856]:                 "ceph.osd_fsid": "a01bc6a6-e368-4763-a10f-41794e4ef717",
Dec 06 09:39:28 compute-1 objective_noyce[76856]:                 "ceph.osd_id": "0",
Dec 06 09:39:28 compute-1 objective_noyce[76856]:                 "ceph.osdspec_affinity": "default_drive_group",
Dec 06 09:39:28 compute-1 objective_noyce[76856]:                 "ceph.type": "block",
Dec 06 09:39:28 compute-1 objective_noyce[76856]:                 "ceph.vdo": "0",
Dec 06 09:39:28 compute-1 objective_noyce[76856]:                 "ceph.with_tpm": "0"
Dec 06 09:39:28 compute-1 objective_noyce[76856]:             },
Dec 06 09:39:28 compute-1 objective_noyce[76856]:             "type": "block",
Dec 06 09:39:28 compute-1 objective_noyce[76856]:             "vg_name": "ceph_vg0"
Dec 06 09:39:28 compute-1 objective_noyce[76856]:         }
Dec 06 09:39:28 compute-1 objective_noyce[76856]:     ]
Dec 06 09:39:28 compute-1 objective_noyce[76856]: }
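The objective_noyce container is the "ceph-volume lvm list --format json" call requested two sudo entries earlier; cephadm parses this to learn which OSD ids live on which LVs before generating systemd units. The structure is a map of osd_id to a list of device entries, with the authoritative metadata in the LV tags. A minimal parse of the output printed above:

    # Parse "ceph-volume lvm list --format json" into (osd_id, osd_fsid,
    # lv_path) tuples, the facts needed to deploy ceph-<fsid>@osd.<id>.
    import json
    import subprocess

    raw = subprocess.run(
        ["ceph-volume", "lvm", "list", "--format", "json"],
        capture_output=True, text=True, check=True,
    ).stdout

    for osd_id, entries in json.loads(raw).items():
        for lv in entries:
            tags = lv["tags"]
            print(osd_id, tags["ceph.osd_fsid"], lv["lv_path"])
            # -> 0 a01bc6a6-e368-4763-a10f-41794e4ef717 /dev/ceph_vg0/ceph_lv0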
Dec 06 09:39:28 compute-1 systemd[1]: libpod-2801b7c0390a8d7af79d9e28e1e2c2d9cf7f13678710e708f19be67b5d219238.scope: Deactivated successfully.
Dec 06 09:39:28 compute-1 podman[76839]: 2025-12-06 09:39:28.044900046 +0000 UTC m=+0.536371054 container died 2801b7c0390a8d7af79d9e28e1e2c2d9cf7f13678710e708f19be67b5d219238 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=objective_noyce, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 06 09:39:28 compute-1 systemd[1]: var-lib-containers-storage-overlay-d8f587168e5728a14981cb5200db2fa6d4c36db187778d06fea6675aa383effb-merged.mount: Deactivated successfully.
Dec 06 09:39:28 compute-1 podman[76839]: 2025-12-06 09:39:28.104867264 +0000 UTC m=+0.596338262 container remove 2801b7c0390a8d7af79d9e28e1e2c2d9cf7f13678710e708f19be67b5d219238 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=objective_noyce, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Dec 06 09:39:28 compute-1 systemd[1]: libpod-conmon-2801b7c0390a8d7af79d9e28e1e2c2d9cf7f13678710e708f19be67b5d219238.scope: Deactivated successfully.
Dec 06 09:39:28 compute-1 sudo[76736]: pam_unix(sudo:session): session closed for user root
Dec 06 09:39:28 compute-1 sudo[76878]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:39:28 compute-1 sudo[76878]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:39:28 compute-1 sudo[76878]: pam_unix(sudo:session): session closed for user root
Dec 06 09:39:28 compute-1 sudo[76903]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 _orch deploy --fsid 5ecd3f74-dade-5fc4-92ce-8950ae424258
Dec 06 09:39:28 compute-1 sudo[76903]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:39:28 compute-1 podman[76971]: 2025-12-06 09:39:28.834169411 +0000 UTC m=+0.052312125 container create dac94e84a5656c2b8c3f26ce2d3a4758c51f1560760ac45906fa8be5de884297 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=eager_jepsen, org.label-schema.license=GPLv2, CEPH_REF=squid, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 06 09:39:28 compute-1 systemd[1]: Started libpod-conmon-dac94e84a5656c2b8c3f26ce2d3a4758c51f1560760ac45906fa8be5de884297.scope.
Dec 06 09:39:28 compute-1 podman[76971]: 2025-12-06 09:39:28.810253436 +0000 UTC m=+0.028396170 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 06 09:39:28 compute-1 systemd[1]: Started libcrun container.
Dec 06 09:39:28 compute-1 podman[76971]: 2025-12-06 09:39:28.934890736 +0000 UTC m=+0.153033490 container init dac94e84a5656c2b8c3f26ce2d3a4758c51f1560760ac45906fa8be5de884297 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=eager_jepsen, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 09:39:28 compute-1 podman[76971]: 2025-12-06 09:39:28.945378098 +0000 UTC m=+0.163520832 container start dac94e84a5656c2b8c3f26ce2d3a4758c51f1560760ac45906fa8be5de884297 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=eager_jepsen, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 06 09:39:28 compute-1 podman[76971]: 2025-12-06 09:39:28.949360015 +0000 UTC m=+0.167502719 container attach dac94e84a5656c2b8c3f26ce2d3a4758c51f1560760ac45906fa8be5de884297 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=eager_jepsen, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 06 09:39:28 compute-1 eager_jepsen[76988]: 167 167
Dec 06 09:39:28 compute-1 systemd[1]: libpod-dac94e84a5656c2b8c3f26ce2d3a4758c51f1560760ac45906fa8be5de884297.scope: Deactivated successfully.
Dec 06 09:39:28 compute-1 podman[76971]: 2025-12-06 09:39:28.953294601 +0000 UTC m=+0.171437305 container died dac94e84a5656c2b8c3f26ce2d3a4758c51f1560760ac45906fa8be5de884297 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=eager_jepsen, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.schema-version=1.0, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 06 09:39:28 compute-1 systemd[1]: var-lib-containers-storage-overlay-77326abbc5f0f3173a79671c7b1c210057b0a35127159252d53e06946da7f835-merged.mount: Deactivated successfully.
Dec 06 09:39:28 compute-1 podman[76971]: 2025-12-06 09:39:28.998078186 +0000 UTC m=+0.216220900 container remove dac94e84a5656c2b8c3f26ce2d3a4758c51f1560760ac45906fa8be5de884297 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=eager_jepsen, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1)
Dec 06 09:39:29 compute-1 systemd[1]: libpod-conmon-dac94e84a5656c2b8c3f26ce2d3a4758c51f1560760ac45906fa8be5de884297.scope: Deactivated successfully.
Dec 06 09:39:29 compute-1 podman[77017]: 2025-12-06 09:39:29.255263048 +0000 UTC m=+0.054465700 container create 7d4cee0917fa854568a384399dea22bf6ded8f296932a2db4c62433b832cdb88 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-0-activate-test, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 06 09:39:29 compute-1 systemd[1]: Started libpod-conmon-7d4cee0917fa854568a384399dea22bf6ded8f296932a2db4c62433b832cdb88.scope.
Dec 06 09:39:29 compute-1 podman[77017]: 2025-12-06 09:39:29.22691076 +0000 UTC m=+0.026113512 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 06 09:39:29 compute-1 systemd[1]: Started libcrun container.
Dec 06 09:39:29 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5266eb2440c04747a6e446e4b53961da48954ef17a86adae1b17a8652f6bf5c9/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 06 09:39:29 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5266eb2440c04747a6e446e4b53961da48954ef17a86adae1b17a8652f6bf5c9/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 06 09:39:29 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5266eb2440c04747a6e446e4b53961da48954ef17a86adae1b17a8652f6bf5c9/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 06 09:39:29 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5266eb2440c04747a6e446e4b53961da48954ef17a86adae1b17a8652f6bf5c9/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 06 09:39:29 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5266eb2440c04747a6e446e4b53961da48954ef17a86adae1b17a8652f6bf5c9/merged/var/lib/ceph/osd/ceph-0 supports timestamps until 2038 (0x7fffffff)
Dec 06 09:39:29 compute-1 podman[77017]: 2025-12-06 09:39:29.385606694 +0000 UTC m=+0.184809366 container init 7d4cee0917fa854568a384399dea22bf6ded8f296932a2db4c62433b832cdb88 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-0-activate-test, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec 06 09:39:29 compute-1 podman[77017]: 2025-12-06 09:39:29.397307498 +0000 UTC m=+0.196510160 container start 7d4cee0917fa854568a384399dea22bf6ded8f296932a2db4c62433b832cdb88 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-0-activate-test, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_REF=squid, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS)
Dec 06 09:39:29 compute-1 podman[77017]: 2025-12-06 09:39:29.401500852 +0000 UTC m=+0.200703554 container attach 7d4cee0917fa854568a384399dea22bf6ded8f296932a2db4c62433b832cdb88 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-0-activate-test, org.label-schema.license=GPLv2, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec 06 09:39:29 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-0-activate-test[77034]: usage: ceph-volume activate [-h] [--osd-id OSD_ID] [--osd-uuid OSD_FSID]
Dec 06 09:39:29 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-0-activate-test[77034]:                             [--no-systemd] [--no-tmpfs]
Dec 06 09:39:29 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-0-activate-test[77034]: ceph-volume activate: error: unrecognized arguments: --bad-option
Dec 06 09:39:29 compute-1 systemd[1]: libpod-7d4cee0917fa854568a384399dea22bf6ded8f296932a2db4c62433b832cdb88.scope: Deactivated successfully.
Dec 06 09:39:29 compute-1 podman[77017]: 2025-12-06 09:39:29.574223241 +0000 UTC m=+0.373425923 container died 7d4cee0917fa854568a384399dea22bf6ded8f296932a2db4c62433b832cdb88 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-0-activate-test, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 06 09:39:29 compute-1 systemd[1]: var-lib-containers-storage-overlay-5266eb2440c04747a6e446e4b53961da48954ef17a86adae1b17a8652f6bf5c9-merged.mount: Deactivated successfully.
Dec 06 09:39:29 compute-1 podman[77017]: 2025-12-06 09:39:29.62609644 +0000 UTC m=+0.425299092 container remove 7d4cee0917fa854568a384399dea22bf6ded8f296932a2db4c62433b832cdb88 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-0-activate-test, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 09:39:29 compute-1 systemd[1]: libpod-conmon-7d4cee0917fa854568a384399dea22bf6ded8f296932a2db4c62433b832cdb88.scope: Deactivated successfully.
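The "osd-0-activate-test" container failing on --bad-option is not a fault: the usage text and the "unrecognized arguments" error are the point. cephadm appears to probe "ceph-volume activate" with a deliberately invalid flag so it can read the usage string from stderr and learn which options (for example --no-tmpfs, visible in the usage above) this image's ceph-volume supports before writing the real unit. A hedged sketch of that feature probe:

    # Sketch: feature-detect ceph-volume activate flags by forcing a usage
    # error, mirroring what the "activate-test" container appears to do.
    import subprocess

    probe = subprocess.run(
        ["ceph-volume", "activate", "--bad-option"],
        capture_output=True, text=True,  # exits non-zero by design
    )
    usage = probe.stderr
    print("activate supports --no-tmpfs:", "--no-tmpfs" in usage)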
Dec 06 09:39:29 compute-1 systemd[1]: Reloading.
Dec 06 09:39:30 compute-1 systemd-rc-local-generator[77096]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:39:30 compute-1 systemd-sysv-generator[77099]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:39:30 compute-1 systemd[1]: Reloading.
Dec 06 09:39:30 compute-1 systemd-sysv-generator[77138]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:39:30 compute-1 systemd-rc-local-generator[77133]: /etc/rc.d/rc.local is not marked executable, skipping.
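The two "Reloading." passes bracket cephadm dropping the freshly generated unit files onto disk; once systemd re-reads them it can start the OSD. cephadm names OSD units ceph-<fsid>@osd.<id>, which matches the "Starting Ceph osd.0 for 5ecd3f74-..." line that follows. A one-liner to address that unit directly:

    # Build the cephadm-style unit name for this OSD and query it.
    import subprocess

    fsid, osd_id = "5ecd3f74-dade-5fc4-92ce-8950ae424258", 0
    unit = f"ceph-{fsid}@osd.{osd_id}.service"
    subprocess.run(["systemctl", "status", "--no-pager", unit], check=False)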
Dec 06 09:39:30 compute-1 systemd[1]: Starting Ceph osd.0 for 5ecd3f74-dade-5fc4-92ce-8950ae424258...
Dec 06 09:39:30 compute-1 podman[77194]: 2025-12-06 09:39:30.706403867 +0000 UTC m=+0.038481479 container create e6e4efa68513d96190cc0bb10d531d240c0cb225b8cb4d9f4d845af927a63019 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-0-activate, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.build-date=20250325)
Dec 06 09:39:30 compute-1 systemd[1]: Started libcrun container.
Dec 06 09:39:30 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d9a96d6d8008c97dcb54d61cc909df3f92a9e8ac260d827abc35f0239fa3c1e5/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 06 09:39:30 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d9a96d6d8008c97dcb54d61cc909df3f92a9e8ac260d827abc35f0239fa3c1e5/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 06 09:39:30 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d9a96d6d8008c97dcb54d61cc909df3f92a9e8ac260d827abc35f0239fa3c1e5/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 06 09:39:30 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d9a96d6d8008c97dcb54d61cc909df3f92a9e8ac260d827abc35f0239fa3c1e5/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 06 09:39:30 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d9a96d6d8008c97dcb54d61cc909df3f92a9e8ac260d827abc35f0239fa3c1e5/merged/var/lib/ceph/osd/ceph-0 supports timestamps until 2038 (0x7fffffff)
Dec 06 09:39:30 compute-1 podman[77194]: 2025-12-06 09:39:30.690293691 +0000 UTC m=+0.022371323 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 06 09:39:30 compute-1 podman[77194]: 2025-12-06 09:39:30.792224907 +0000 UTC m=+0.124302559 container init e6e4efa68513d96190cc0bb10d531d240c0cb225b8cb4d9f4d845af927a63019 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-0-activate, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default)
Dec 06 09:39:30 compute-1 podman[77194]: 2025-12-06 09:39:30.803515747 +0000 UTC m=+0.135593379 container start e6e4efa68513d96190cc0bb10d531d240c0cb225b8cb4d9f4d845af927a63019 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-0-activate, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec 06 09:39:30 compute-1 podman[77194]: 2025-12-06 09:39:30.852401143 +0000 UTC m=+0.184478855 container attach e6e4efa68513d96190cc0bb10d531d240c0cb225b8cb4d9f4d845af927a63019 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-0-activate, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 06 09:39:30 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-0-activate[77209]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec 06 09:39:30 compute-1 bash[77194]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec 06 09:39:30 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-0-activate[77209]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec 06 09:39:30 compute-1 bash[77194]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec 06 09:39:31 compute-1 lvm[77290]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 06 09:39:31 compute-1 lvm[77290]: VG ceph_vg0 finished
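The two lvm[77290] lines are LVM's event-driven autoactivation: a udev event for /dev/loop3 triggers a pvscan, and once every PV of ceph_vg0 is online the VG is activated ("finished"). The loop device suggests a file-backed lab OSD rather than a physical disk. A quick way to see the stack behind osd.0, using names taken from this log:

    # PV -> VG -> LV chain backing the OSD's block device
    pvs /dev/loop3
    lvs ceph_vg0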
Dec 06 09:39:31 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-0-activate[77209]: --> Failed to activate via raw: did not find any matching OSD to activate
Dec 06 09:39:31 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-0-activate[77209]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec 06 09:39:31 compute-1 bash[77194]: --> Failed to activate via raw: did not find any matching OSD to activate
Dec 06 09:39:31 compute-1 bash[77194]: Running command: /usr/bin/ceph-authtool --gen-print-key
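"Failed to activate via raw" is expected noise here: the activate container tries ceph-volume's raw mode first and falls back to LVM mode, and this OSD is LVM-backed. (The same asymmetry shows up again at 09:39:35, when a helper container running a raw list prints an empty {}.) A sketch of the two scans, run inside a ceph container with /dev and /var/lib/ceph mounted; the expected outputs are inferences from this log, not guarantees:

    ceph-volume raw list --format json   # expected here: {} (no raw-mode OSDs)
    ceph-volume lvm list --format json   # expected here: osd.0 on ceph_vg0/ceph_lv0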
Dec 06 09:39:31 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-0-activate[77209]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec 06 09:39:31 compute-1 bash[77194]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec 06 09:39:31 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-0-activate[77209]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Dec 06 09:39:31 compute-1 bash[77194]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Dec 06 09:39:31 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-0-activate[77209]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-0 --no-mon-config
Dec 06 09:39:31 compute-1 bash[77194]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-0 --no-mon-config
Dec 06 09:39:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-0-activate[77209]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-0/block
Dec 06 09:39:32 compute-1 bash[77194]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-0/block
Dec 06 09:39:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-0-activate[77209]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-0/block
Dec 06 09:39:32 compute-1 bash[77194]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-0/block
Dec 06 09:39:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-0-activate[77209]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Dec 06 09:39:32 compute-1 bash[77194]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Dec 06 09:39:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-0-activate[77209]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Dec 06 09:39:32 compute-1 bash[77194]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Dec 06 09:39:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-0-activate[77209]: --> ceph-volume lvm activate successful for osd ID: 0
Dec 06 09:39:32 compute-1 bash[77194]: --> ceph-volume lvm activate successful for osd ID: 0
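Taken together, the activation steps above are: generate throwaway keys, chown the OSD directory, prime-osd-dir to rebuild /var/lib/ceph/osd/ceph-0 from the bluestore label, re-point the block symlink at the LV, and fix ownership (ceph is uid/gid 167 in the image). A hedged sketch of replaying the same activation by hand inside a ceph container; the OSD fsid is not shown in this log, so it stays a placeholder:

    ceph-bluestore-tool --cluster=ceph prime-osd-dir \
        --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-0 --no-mon-config
    ceph-volume lvm activate 0 <osd-fsid> --no-systemd   # cephadm manages the unit itself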
Dec 06 09:39:32 compute-1 systemd[1]: libpod-e6e4efa68513d96190cc0bb10d531d240c0cb225b8cb4d9f4d845af927a63019.scope: Deactivated successfully.
Dec 06 09:39:32 compute-1 systemd[1]: libpod-e6e4efa68513d96190cc0bb10d531d240c0cb225b8cb4d9f4d845af927a63019.scope: Consumed 1.597s CPU time.
Dec 06 09:39:32 compute-1 podman[77384]: 2025-12-06 09:39:32.273619698 +0000 UTC m=+0.030391309 container died e6e4efa68513d96190cc0bb10d531d240c0cb225b8cb4d9f4d845af927a63019 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-0-activate, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.40.1)
Dec 06 09:39:32 compute-1 systemd[1]: var-lib-containers-storage-overlay-d9a96d6d8008c97dcb54d61cc909df3f92a9e8ac260d827abc35f0239fa3c1e5-merged.mount: Deactivated successfully.
Dec 06 09:39:32 compute-1 podman[77384]: 2025-12-06 09:39:32.357317396 +0000 UTC m=+0.114089037 container remove e6e4efa68513d96190cc0bb10d531d240c0cb225b8cb4d9f4d845af927a63019 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-0-activate, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
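That completes a full podman lifecycle for the one-shot activate container: image pull, create, init, start, attach, died, remove, all inside about two seconds. To watch the same event stream for a later run (container name copied from the log):

    podman events --since 10m \
        --filter container=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-0-activate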
Dec 06 09:39:32 compute-1 podman[77446]: 2025-12-06 09:39:32.628229591 +0000 UTC m=+0.059621688 container create 0f0393491dd03b5ae266c2a248287651acc39819e0ff2bca59276136b4944860 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 09:39:32 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/73a9c203952191e0f9007cfcac9613bd129315c5989eb5abbe47ac3581205d5f/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 06 09:39:32 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/73a9c203952191e0f9007cfcac9613bd129315c5989eb5abbe47ac3581205d5f/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 06 09:39:32 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/73a9c203952191e0f9007cfcac9613bd129315c5989eb5abbe47ac3581205d5f/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 06 09:39:32 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/73a9c203952191e0f9007cfcac9613bd129315c5989eb5abbe47ac3581205d5f/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 06 09:39:32 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/73a9c203952191e0f9007cfcac9613bd129315c5989eb5abbe47ac3581205d5f/merged/var/lib/ceph/osd/ceph-0 supports timestamps until 2038 (0x7fffffff)
Dec 06 09:39:32 compute-1 podman[77446]: 2025-12-06 09:39:32.596921761 +0000 UTC m=+0.028313848 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 06 09:39:32 compute-1 podman[77446]: 2025-12-06 09:39:32.71631965 +0000 UTC m=+0.147711737 container init 0f0393491dd03b5ae266c2a248287651acc39819e0ff2bca59276136b4944860 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-0, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325)
Dec 06 09:39:32 compute-1 podman[77446]: 2025-12-06 09:39:32.723672224 +0000 UTC m=+0.155064321 container start 0f0393491dd03b5ae266c2a248287651acc39819e0ff2bca59276136b4944860 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-0, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0)
Dec 06 09:39:32 compute-1 bash[77446]: 0f0393491dd03b5ae266c2a248287651acc39819e0ff2bca59276136b4944860
Dec 06 09:39:32 compute-1 systemd[1]: Started Ceph osd.0 for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
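cephadm names the long-running unit ceph-<cluster fsid>@osd.<id>.service, and the container it wraps drops the -activate suffix. Two ways to confirm the daemon this log just started, with the fsid taken from the lines above:

    systemctl status 'ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@osd.0.service'
    podman ps --filter name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-0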
Dec 06 09:39:32 compute-1 ceph-osd[77465]: set uid:gid to 167:167 (ceph:ceph)
Dec 06 09:39:32 compute-1 ceph-osd[77465]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-osd, pid 2
Dec 06 09:39:32 compute-1 ceph-osd[77465]: pidfile_write: ignore empty --pid-file
Dec 06 09:39:32 compute-1 ceph-osd[77465]: bdev(0x55fb22739800 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Dec 06 09:39:32 compute-1 ceph-osd[77465]: bdev(0x55fb22739800 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Dec 06 09:39:32 compute-1 ceph-osd[77465]: bdev(0x55fb22739800 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 06 09:39:32 compute-1 ceph-osd[77465]: bdev(0x55fb22739800 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 06 09:39:32 compute-1 ceph-osd[77465]: bdev(0x55fb22739800 /var/lib/ceph/osd/ceph-0/block) close
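This bdev open/close block repeats throughout the rest of the log and both warnings in it are benign: the F_SET_FILE_RW_HINT fcntl returns EINVAL (22) because the device-mapper LV does not support per-file write hints, and the LV reports st_blksize 512, in line with a 512-byte logical sector size, while bluestore sticks with its configured 4 KiB block size. To compare the sector sizes the kernel reports for the LV:

    # logical and physical sector sizes of the OSD's block device
    blockdev --getss --getpbsz /dev/ceph_vg0/ceph_lv0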
Dec 06 09:39:32 compute-1 sudo[76903]: pam_unix(sudo:session): session closed for user root
Dec 06 09:39:32 compute-1 sudo[77477]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:39:32 compute-1 sudo[77477]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:39:32 compute-1 sudo[77477]: pam_unix(sudo:session): session closed for user root
Dec 06 09:39:32 compute-1 sudo[77502]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ceph-volume --fsid 5ecd3f74-dade-5fc4-92ce-8950ae424258 -- raw list --format json
Dec 06 09:39:32 compute-1 sudo[77502]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
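The orchestrator's SSH loop is visible in these sudo lines: it logs in as ceph-admin, sudo-runs `which python3`, then executes the cephadm copy it keeps under /var/lib/ceph/<fsid>/ to inventory devices. An equivalent invocation by hand, assuming a cephadm binary is installed on the host (arguments copied from the logged command line):

    sudo cephadm --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec \
        ceph-volume --fsid 5ecd3f74-dade-5fc4-92ce-8950ae424258 -- raw list --format json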
Dec 06 09:39:33 compute-1 ceph-osd[77465]: bdev(0x55fb22739800 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Dec 06 09:39:33 compute-1 ceph-osd[77465]: bdev(0x55fb22739800 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Dec 06 09:39:33 compute-1 ceph-osd[77465]: bdev(0x55fb22739800 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 06 09:39:33 compute-1 ceph-osd[77465]: bdev(0x55fb22739800 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 06 09:39:33 compute-1 ceph-osd[77465]: bdev(0x55fb22739800 /var/lib/ceph/osd/ceph-0/block) close
Dec 06 09:39:33 compute-1 ceph-osd[77465]: bdev(0x55fb22739800 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Dec 06 09:39:33 compute-1 ceph-osd[77465]: bdev(0x55fb22739800 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Dec 06 09:39:33 compute-1 ceph-osd[77465]: bdev(0x55fb22739800 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 06 09:39:33 compute-1 ceph-osd[77465]: bdev(0x55fb22739800 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 06 09:39:33 compute-1 ceph-osd[77465]: bdev(0x55fb22739800 /var/lib/ceph/osd/ceph-0/block) close
Dec 06 09:39:33 compute-1 ceph-osd[77465]: bdev(0x55fb22739800 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Dec 06 09:39:33 compute-1 ceph-osd[77465]: bdev(0x55fb22739800 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Dec 06 09:39:33 compute-1 ceph-osd[77465]: bdev(0x55fb22739800 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 06 09:39:33 compute-1 ceph-osd[77465]: bdev(0x55fb22739800 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 06 09:39:33 compute-1 ceph-osd[77465]: bdev(0x55fb22739800 /var/lib/ceph/osd/ceph-0/block) close
Dec 06 09:39:33 compute-1 ceph-osd[77465]: bdev(0x55fb22739800 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Dec 06 09:39:33 compute-1 ceph-osd[77465]: bdev(0x55fb22739800 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Dec 06 09:39:33 compute-1 ceph-osd[77465]: bdev(0x55fb22739800 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 06 09:39:33 compute-1 ceph-osd[77465]: bdev(0x55fb22739800 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 06 09:39:33 compute-1 ceph-osd[77465]: bdev(0x55fb22739800 /var/lib/ceph/osd/ceph-0/block) close
Dec 06 09:39:33 compute-1 podman[77575]: 2025-12-06 09:39:33.496517764 +0000 UTC m=+0.117992672 container create 3a9a52a6044f2e3916cd1722e40a7034868324e9773461be98fa4b31ea66220d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=clever_cannon, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 06 09:39:33 compute-1 podman[77575]: 2025-12-06 09:39:33.405337438 +0000 UTC m=+0.026812386 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 06 09:39:33 compute-1 systemd[1]: Started libpod-conmon-3a9a52a6044f2e3916cd1722e40a7034868324e9773461be98fa4b31ea66220d.scope.
Dec 06 09:39:33 compute-1 systemd[1]: Started libcrun container.
Dec 06 09:39:33 compute-1 podman[77575]: 2025-12-06 09:39:33.599813437 +0000 UTC m=+0.221288425 container init 3a9a52a6044f2e3916cd1722e40a7034868324e9773461be98fa4b31ea66220d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=clever_cannon, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 09:39:33 compute-1 podman[77575]: 2025-12-06 09:39:33.612581067 +0000 UTC m=+0.234056015 container start 3a9a52a6044f2e3916cd1722e40a7034868324e9773461be98fa4b31ea66220d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=clever_cannon, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 06 09:39:33 compute-1 podman[77575]: 2025-12-06 09:39:33.616402779 +0000 UTC m=+0.237877707 container attach 3a9a52a6044f2e3916cd1722e40a7034868324e9773461be98fa4b31ea66220d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=clever_cannon, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid)
Dec 06 09:39:33 compute-1 clever_cannon[77594]: 167 167
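clever_cannon is one of the short-lived, auto-named helper containers cephadm spins up between daemon starts; the bare "167 167" it prints looks like cephadm's uid/gid probe, which stats a ceph-owned path inside the image to learn which uid/gid host directories should be chowned to (consistent with "set uid:gid to 167:167" above). The exact command is not in the log; a hypothetical reconstruction:

    # guess at the probe, not taken from the log
    podman run --rm quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec \
        stat -c '%u %g' /var/lib/ceph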
Dec 06 09:39:33 compute-1 systemd[1]: libpod-3a9a52a6044f2e3916cd1722e40a7034868324e9773461be98fa4b31ea66220d.scope: Deactivated successfully.
Dec 06 09:39:33 compute-1 conmon[77594]: conmon 3a9a52a6044f2e3916cd <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-3a9a52a6044f2e3916cd1722e40a7034868324e9773461be98fa4b31ea66220d.scope/container/memory.events
Dec 06 09:39:33 compute-1 podman[77575]: 2025-12-06 09:39:33.623622948 +0000 UTC m=+0.245097876 container died 3a9a52a6044f2e3916cd1722e40a7034868324e9773461be98fa4b31ea66220d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=clever_cannon, ceph=True, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0)
Dec 06 09:39:33 compute-1 systemd[1]: var-lib-containers-storage-overlay-c0d084adbb910e2d84183d32f9845872f41323414d21fe37ecb727234b4c5dc6-merged.mount: Deactivated successfully.
Dec 06 09:39:33 compute-1 ceph-osd[77465]: bdev(0x55fb22739800 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Dec 06 09:39:33 compute-1 ceph-osd[77465]: bdev(0x55fb22739800 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Dec 06 09:39:33 compute-1 ceph-osd[77465]: bdev(0x55fb22739800 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 06 09:39:33 compute-1 ceph-osd[77465]: bdev(0x55fb22739800 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 06 09:39:33 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
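_set_cache_sizes shows the 1 GiB bluestore cache (matching the HDD-class default) being split by ratio: meta 0.45, kv 0.45, kv_onode 0.04, data 0.06, which sum to 1.0. The kv share becomes RocksDB's block cache: 0.45 * 1073741824 = 483183820 bytes, exactly the BinnedLRUCache capacity printed in the RocksDB options dump further down.

    python3 -c 'print(int(1073741824 * 0.45))'   # -> 483183820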
Dec 06 09:39:33 compute-1 ceph-osd[77465]: bdev(0x55fb22739c00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Dec 06 09:39:33 compute-1 ceph-osd[77465]: bdev(0x55fb22739c00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Dec 06 09:39:33 compute-1 ceph-osd[77465]: bdev(0x55fb22739c00 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 06 09:39:33 compute-1 ceph-osd[77465]: bdev(0x55fb22739c00 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 06 09:39:33 compute-1 ceph-osd[77465]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-0/block size 20 GiB
Dec 06 09:39:33 compute-1 ceph-osd[77465]: bdev(0x55fb22739c00 /var/lib/ceph/osd/ceph-0/block) close
Dec 06 09:39:33 compute-1 podman[77575]: 2025-12-06 09:39:33.796233523 +0000 UTC m=+0.417708441 container remove 3a9a52a6044f2e3916cd1722e40a7034868324e9773461be98fa4b31ea66220d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=clever_cannon, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, OSD_FLAVOR=default, io.buildah.version=1.40.1)
Dec 06 09:39:33 compute-1 systemd[1]: libpod-conmon-3a9a52a6044f2e3916cd1722e40a7034868324e9773461be98fa4b31ea66220d.scope: Deactivated successfully.
Dec 06 09:39:33 compute-1 ceph-osd[77465]: bdev(0x55fb22739800 /var/lib/ceph/osd/ceph-0/block) close
Dec 06 09:39:34 compute-1 podman[77622]: 2025-12-06 09:39:33.996526262 +0000 UTC m=+0.029263170 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 06 09:39:34 compute-1 ceph-osd[77465]: starting osd.0 osd_data /var/lib/ceph/osd/ceph-0 /var/lib/ceph/osd/ceph-0/journal
Dec 06 09:39:34 compute-1 ceph-osd[77465]: load: jerasure load: lrc 
Dec 06 09:39:34 compute-1 ceph-osd[77465]: bdev(0x55fb235d4c00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Dec 06 09:39:34 compute-1 ceph-osd[77465]: bdev(0x55fb235d4c00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Dec 06 09:39:34 compute-1 ceph-osd[77465]: bdev(0x55fb235d4c00 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 06 09:39:34 compute-1 ceph-osd[77465]: bdev(0x55fb235d4c00 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 06 09:39:34 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Dec 06 09:39:34 compute-1 ceph-osd[77465]: bdev(0x55fb235d4c00 /var/lib/ceph/osd/ceph-0/block) close
Dec 06 09:39:34 compute-1 podman[77622]: 2025-12-06 09:39:34.285239743 +0000 UTC m=+0.317976631 container create 6d886781c1c78dea24fdd24bd130f18ff9f5e88c380174c752ec15b7e1c5a59a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=festive_buck, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True)
Dec 06 09:39:34 compute-1 systemd[1]: Started libpod-conmon-6d886781c1c78dea24fdd24bd130f18ff9f5e88c380174c752ec15b7e1c5a59a.scope.
Dec 06 09:39:34 compute-1 systemd[1]: Started libcrun container.
Dec 06 09:39:34 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/88c6814e6735576501c510f4f224d0b84b986d97970aa23c05b39f2d6f54f09a/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 06 09:39:34 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/88c6814e6735576501c510f4f224d0b84b986d97970aa23c05b39f2d6f54f09a/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 06 09:39:34 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/88c6814e6735576501c510f4f224d0b84b986d97970aa23c05b39f2d6f54f09a/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 06 09:39:34 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/88c6814e6735576501c510f4f224d0b84b986d97970aa23c05b39f2d6f54f09a/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 06 09:39:34 compute-1 ceph-osd[77465]: bdev(0x55fb235d4c00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Dec 06 09:39:34 compute-1 ceph-osd[77465]: bdev(0x55fb235d4c00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Dec 06 09:39:34 compute-1 ceph-osd[77465]: bdev(0x55fb235d4c00 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 06 09:39:34 compute-1 ceph-osd[77465]: bdev(0x55fb235d4c00 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 06 09:39:34 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Dec 06 09:39:34 compute-1 ceph-osd[77465]: bdev(0x55fb235d4c00 /var/lib/ceph/osd/ceph-0/block) close
Dec 06 09:39:34 compute-1 podman[77622]: 2025-12-06 09:39:34.599388059 +0000 UTC m=+0.632125037 container init 6d886781c1c78dea24fdd24bd130f18ff9f5e88c380174c752ec15b7e1c5a59a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=festive_buck, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_REF=squid, OSD_FLAVOR=default)
Dec 06 09:39:34 compute-1 podman[77622]: 2025-12-06 09:39:34.615748563 +0000 UTC m=+0.648485481 container start 6d886781c1c78dea24fdd24bd130f18ff9f5e88c380174c752ec15b7e1c5a59a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=festive_buck, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 06 09:39:34 compute-1 podman[77622]: 2025-12-06 09:39:34.621482761 +0000 UTC m=+0.654219659 container attach 6d886781c1c78dea24fdd24bd130f18ff9f5e88c380174c752ec15b7e1c5a59a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=festive_buck, org.label-schema.build-date=20250325, ceph=True, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, OSD_FLAVOR=default)
Dec 06 09:39:34 compute-1 ceph-osd[77465]: mClockScheduler: set_osd_capacity_params_from_config: osd_bandwidth_cost_per_io: 499321.90 bytes/io, osd_bandwidth_capacity_per_shard 157286400.00 bytes/second
Dec 06 09:39:34 compute-1 ceph-osd[77465]: osd.0:0.OSDShard using op scheduler mclock_scheduler, cutoff=196
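The mclock numbers follow from the rotational-device defaults: 157286400 bytes/s is 150 MiB/s, the default osd_mclock_max_sequential_bandwidth_hdd, and dividing it by the default osd_mclock_max_capacity_iops_hdd of 315 reproduces the logged per-IO cost:

    python3 -c 'print(157286400 / 315)'                      # -> 499321.90..., as logged
    ceph config get osd.0 osd_mclock_max_capacity_iops_hdd   # 315.0 unless overridden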
Dec 06 09:39:34 compute-1 ceph-osd[77465]: bdev(0x55fb235d4c00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Dec 06 09:39:34 compute-1 ceph-osd[77465]: bdev(0x55fb235d4c00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Dec 06 09:39:34 compute-1 ceph-osd[77465]: bdev(0x55fb235d4c00 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 06 09:39:34 compute-1 ceph-osd[77465]: bdev(0x55fb235d4c00 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 06 09:39:34 compute-1 ceph-osd[77465]: bdev(0x55fb235d4c00 /var/lib/ceph/osd/ceph-0/block) close
Dec 06 09:39:35 compute-1 ceph-osd[77465]: bdev(0x55fb235d4c00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Dec 06 09:39:35 compute-1 ceph-osd[77465]: bdev(0x55fb235d4c00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Dec 06 09:39:35 compute-1 ceph-osd[77465]: bdev(0x55fb235d4c00 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 06 09:39:35 compute-1 ceph-osd[77465]: bdev(0x55fb235d4c00 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 06 09:39:35 compute-1 ceph-osd[77465]: bdev(0x55fb235d4c00 /var/lib/ceph/osd/ceph-0/block) close
Dec 06 09:39:35 compute-1 lvm[77732]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 06 09:39:35 compute-1 lvm[77732]: VG ceph_vg0 finished
Dec 06 09:39:35 compute-1 ceph-osd[77465]: bdev(0x55fb235d4c00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Dec 06 09:39:35 compute-1 ceph-osd[77465]: bdev(0x55fb235d4c00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Dec 06 09:39:35 compute-1 ceph-osd[77465]: bdev(0x55fb235d4c00 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 06 09:39:35 compute-1 ceph-osd[77465]: bdev(0x55fb235d4c00 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 06 09:39:35 compute-1 ceph-osd[77465]: bdev(0x55fb235d4c00 /var/lib/ceph/osd/ceph-0/block) close
Dec 06 09:39:35 compute-1 lvm[77737]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 06 09:39:35 compute-1 lvm[77737]: VG ceph_vg0 finished
Dec 06 09:39:35 compute-1 festive_buck[77643]: {}
Dec 06 09:39:35 compute-1 systemd[1]: libpod-6d886781c1c78dea24fdd24bd130f18ff9f5e88c380174c752ec15b7e1c5a59a.scope: Deactivated successfully.
Dec 06 09:39:35 compute-1 systemd[1]: libpod-6d886781c1c78dea24fdd24bd130f18ff9f5e88c380174c752ec15b7e1c5a59a.scope: Consumed 1.278s CPU time.
Dec 06 09:39:35 compute-1 podman[77622]: 2025-12-06 09:39:35.421787138 +0000 UTC m=+1.454524016 container died 6d886781c1c78dea24fdd24bd130f18ff9f5e88c380174c752ec15b7e1c5a59a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=festive_buck, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325)
Dec 06 09:39:35 compute-1 systemd[1]: var-lib-containers-storage-overlay-88c6814e6735576501c510f4f224d0b84b986d97970aa23c05b39f2d6f54f09a-merged.mount: Deactivated successfully.
Dec 06 09:39:35 compute-1 podman[77622]: 2025-12-06 09:39:35.480253404 +0000 UTC m=+1.512990282 container remove 6d886781c1c78dea24fdd24bd130f18ff9f5e88c380174c752ec15b7e1c5a59a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=festive_buck, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 06 09:39:35 compute-1 systemd[1]: libpod-conmon-6d886781c1c78dea24fdd24bd130f18ff9f5e88c380174c752ec15b7e1c5a59a.scope: Deactivated successfully.
Dec 06 09:39:35 compute-1 sudo[77502]: pam_unix(sudo:session): session closed for user root
Dec 06 09:39:35 compute-1 ceph-osd[77465]: bdev(0x55fb235d4c00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Dec 06 09:39:35 compute-1 ceph-osd[77465]: bdev(0x55fb235d4c00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Dec 06 09:39:35 compute-1 ceph-osd[77465]: bdev(0x55fb235d4c00 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 06 09:39:35 compute-1 ceph-osd[77465]: bdev(0x55fb235d4c00 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 06 09:39:35 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Dec 06 09:39:35 compute-1 ceph-osd[77465]: bdev(0x55fb235d5000 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Dec 06 09:39:35 compute-1 ceph-osd[77465]: bdev(0x55fb235d5000 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Dec 06 09:39:35 compute-1 ceph-osd[77465]: bdev(0x55fb235d5000 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 06 09:39:35 compute-1 ceph-osd[77465]: bdev(0x55fb235d5000 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 06 09:39:35 compute-1 ceph-osd[77465]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-0/block size 20 GiB
Dec 06 09:39:35 compute-1 ceph-osd[77465]: bluefs mount
Dec 06 09:39:35 compute-1 ceph-osd[77465]: bluefs _init_alloc shared, id 1, capacity 0x4ffc00000, block size 0x10000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: bluefs mount final locked allocations 0 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec 06 09:39:35 compute-1 ceph-osd[77465]: bluefs mount final locked allocations 1 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec 06 09:39:35 compute-1 ceph-osd[77465]: bluefs mount final locked allocations 2 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec 06 09:39:35 compute-1 ceph-osd[77465]: bluefs mount final locked allocations 3 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec 06 09:39:35 compute-1 ceph-osd[77465]: bluefs mount final locked allocations 4 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec 06 09:39:35 compute-1 ceph-osd[77465]: bluefs mount shared_bdev_used = 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _prepare_db_environment set db_paths to db,20397110067 db.slow,20397110067
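_prepare_db_environment budgets db and db.slow at 20397110067 bytes each, which matches 95% of the 21470642176-byte block device opened above; with no separate DB device, RocksDB lives on the shared LV via bluefs and is simply capped a little below the raw size:

    python3 -c 'print(int(21470642176 * 0.95))'   # -> 20397110067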
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: RocksDB version: 7.9.2
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Git sha 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Compile date 2025-07-17 03:12:14
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: DB SUMMARY
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: DB Session ID:  FMGV1GCT5BLWAHJBE977
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: CURRENT file:  CURRENT
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: IDENTITY file:  IDENTITY
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5097 ; 
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                         Options.error_if_exists: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                       Options.create_if_missing: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                         Options.paranoid_checks: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:             Options.flush_verify_memtable_count: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                                     Options.env: 0x55fb235a5dc0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                                      Options.fs: LegacyFileSystem
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                                Options.info_log: 0x55fb235a97a0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.max_file_opening_threads: 16
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                              Options.statistics: (nil)
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                               Options.use_fsync: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                       Options.max_log_file_size: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.log_file_time_to_roll: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                       Options.keep_log_file_num: 1000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                    Options.recycle_log_file_num: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                         Options.allow_fallocate: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                        Options.allow_mmap_reads: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                       Options.allow_mmap_writes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                        Options.use_direct_reads: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.create_missing_column_families: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                              Options.db_log_dir: 
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                                 Options.wal_dir: db.wal
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.table_cache_numshardbits: 6
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                         Options.WAL_ttl_seconds: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                       Options.WAL_size_limit_MB: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:             Options.manifest_preallocation_size: 4194304
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                     Options.is_fd_close_on_exec: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.advise_random_on_open: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                    Options.db_write_buffer_size: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                    Options.write_buffer_manager: 0x55fb2369ea00
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.access_hint_on_compaction_start: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                      Options.use_adaptive_mutex: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                            Options.rate_limiter: (nil)
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                       Options.wal_recovery_mode: 2
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.enable_thread_tracking: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.enable_pipelined_write: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.unordered_write: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:             Options.write_thread_max_yield_usec: 100
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                               Options.row_cache: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                              Options.wal_filter: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:             Options.avoid_flush_during_recovery: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:             Options.allow_ingest_behind: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:             Options.two_write_queues: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:             Options.manual_wal_flush: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:             Options.wal_compression: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:             Options.atomic_flush: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                 Options.persist_stats_to_disk: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                 Options.write_dbid_to_manifest: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                 Options.log_readahead_size: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                 Options.best_efforts_recovery: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:             Options.allow_data_in_errors: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:             Options.db_host_id: __hostname__
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:             Options.enforce_single_del_contracts: true
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:             Options.max_background_jobs: 4
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:             Options.max_background_compactions: -1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:             Options.max_subcompactions: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:           Options.writable_file_max_buffer_size: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:             Options.delayed_write_rate : 16777216
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:             Options.max_total_wal_size: 1073741824
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.stats_dump_period_sec: 600
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                 Options.stats_persist_period_sec: 600
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                          Options.max_open_files: -1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                          Options.bytes_per_sync: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                      Options.wal_bytes_per_sync: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.strict_bytes_per_sync: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:       Options.compaction_readahead_size: 2097152
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.max_background_flushes: -1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Compression algorithms supported:
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         kZSTD supported: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         kXpressCompression supported: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         kBZip2Compression supported: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         kZSTDNotFinalCompression supported: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         kLZ4Compression supported: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         kZlibCompression supported: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         kLZ4HCCompression supported: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         kSnappyCompression supported: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Fast CRC32 supported: Supported on x86
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: DMutex implementation: pthread_mutex_t
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: [db/db_impl/db_impl_readonly.cc:25] Opening the db in read only mode
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
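This first RocksDB open is read-only (db_impl_readonly), which reads as a metadata probe ahead of the full read-write open; recovery starts from MANIFEST-000032, the file listed in DB SUMMARY above. With the OSD stopped, the same store can be inspected offline:

    # offline look at the bluestore kv store (OSD must not be running)
    ceph-kvstore-tool bluestore-kv /var/lib/ceph/osd/ceph-0 stats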
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.compaction_filter: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55fb235a9b60)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55fb227cf350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.compression: LZ4
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:             Options.num_levels: 7
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                           Options.bloom_locality: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                               Options.ttl: 2592000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                       Options.enable_blob_files: false
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                           Options.min_blob_size: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
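
Each column family above is dumped with the same write-path tuning: 16 MiB memtables (up to 64 held in memory, merged six at a time before flush), LZ4 on every level, seven levels with level-style compaction prioritized by kMinOverlappingRatio, and 64 GiB / 256 GiB soft/hard pending-compaction backstops. As a rough sketch, the equivalent settings through RocksDB's public C++ API would read as below; the helper name is illustrative, and the notion that Ceph sets these fields directly is an assumption (BlueStore actually derives them from its bluestore_rocksdb_options string).

    #include <rocksdb/options.h>

    // Hypothetical helper reconstructing the per-CF values printed above.
    rocksdb::ColumnFamilyOptions MakeOsdCfOptions() {
      rocksdb::ColumnFamilyOptions cf;
      cf.write_buffer_size                  = 16 << 20;    // 16777216: 16 MiB memtables
      cf.max_write_buffer_number            = 64;
      cf.min_write_buffer_number_to_merge   = 6;           // batch ~96 MiB per flush
      cf.compression                        = rocksdb::kLZ4Compression;
      cf.num_levels                         = 7;
      cf.level0_file_num_compaction_trigger = 8;
      cf.level0_slowdown_writes_trigger     = 20;
      cf.level0_stop_writes_trigger         = 36;
      cf.target_file_size_base              = 64 << 20;    // 67108864
      cf.max_bytes_for_level_base           = 1ULL << 30;  // 1073741824
      cf.max_bytes_for_level_multiplier     = 8.0;
      cf.compaction_style                   = rocksdb::kCompactionStyleLevel;
      cf.compaction_pri                     = rocksdb::kMinOverlappingRatio;
      cf.soft_pending_compaction_bytes_limit = 64ULL << 30;   // 68719476736
      cf.hard_pending_compaction_bytes_limit = 256ULL << 30;  // 274877906944
      cf.ttl = 2592000;                                    // 30 days
      return cf;
    }

Merging six 16 MiB memtables per flush means each L0 file carries roughly 96 MiB of updates, which pairs with the trigger of 8 L0 files before the first compaction kicks in.
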
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:           Options.merge_operator: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.compaction_filter: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55fb235a9b60)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55fb227cf350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.compression: LZ4
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:             Options.num_levels: 7
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                           Options.bloom_locality: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                               Options.ttl: 2592000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                       Options.enable_blob_files: false
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                           Options.min_blob_size: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
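
The table_factory block for each shard describes the on-disk SST layout: 4 KiB data and metadata blocks, format_version 5, XXH3 checksums ("checksum: 4" in current RocksDB's ChecksumType enum), whole-key bloom filtering, and index/filter blocks cached in a shared BinnedLRUCache of 483183820 bytes split across 2^4 = 16 shards. A minimal equivalent with stock RocksDB types follows; BinnedLRUCache is Ceph's own cache implementation, so NewLRUCache stands in for it here, and the bloom bits-per-key is a placeholder the dump does not print.

    #include <rocksdb/cache.h>
    #include <rocksdb/filter_policy.h>
    #include <rocksdb/options.h>
    #include <rocksdb/table.h>

    // Hypothetical helper mirroring the table_factory options printed above.
    void ApplyOsdTableOptions(rocksdb::ColumnFamilyOptions& cf) {
      rocksdb::BlockBasedTableOptions t;
      t.block_size                     = 4096;
      t.metadata_block_size            = 4096;
      t.cache_index_and_filter_blocks  = true;
      t.pin_top_level_index_and_filter = true;
      t.whole_key_filtering            = true;
      t.format_version                 = 5;
      t.checksum                       = rocksdb::kXXH3;         // "checksum: 4"
      t.filter_policy.reset(rocksdb::NewBloomFilterPolicy(10));  // bits/key assumed
      t.block_cache = rocksdb::NewLRUCache(483183820, /*num_shard_bits=*/4);
      cf.table_factory.reset(rocksdb::NewBlockBasedTableFactory(t));
    }

The capacity works out to ~460.8 MiB, almost exactly 45% of 1 GiB, which suggests it was carved from a 1 GiB BlueStore cache budget (an inference from the number, not something the log states); each of the 16 shards then holds ~28.8 MiB.
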
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:           Options.merge_operator: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.compaction_filter: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55fb235a9b60)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55fb227cf350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.compression: LZ4
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:             Options.num_levels: 7
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                           Options.bloom_locality: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                               Options.ttl: 2592000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                       Options.enable_blob_files: false
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                           Options.min_blob_size: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
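
The table_properties_collectors line is the most Ceph-specific tuning in these dumps: RocksDB's CompactOnDeletionCollector marks an SST file for compaction as soon as any sliding window of 32768 entries contains at least 16384 tombstones (the ratio trigger is disabled at 0), so delete-heavy key ranges, which OSD metadata churn produces constantly, get rewritten promptly rather than waiting for ordinary level compaction. Wiring it up through the stock factory would look like the sketch below (the helper name is illustrative):

    #include <rocksdb/options.h>
    #include <rocksdb/utilities/table_properties_collectors.h>

    // Hypothetical helper matching "Sliding window size = 32768,
    // Deletion trigger = 16384, Deletion ratio = 0" from the log.
    void AddDeletionCollector(rocksdb::ColumnFamilyOptions& cf) {
      cf.table_properties_collector_factories.emplace_back(
          rocksdb::NewCompactOnDeletionCollectorFactory(
              /*sliding_window_size=*/32768,
              /*deletion_trigger=*/16384,
              /*deletion_ratio=*/0.0));
    }
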
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:           Options.merge_operator: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.compaction_filter: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55fb235a9b60)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55fb227cf350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.compression: LZ4
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:             Options.num_levels: 7
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                           Options.bloom_locality: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                               Options.ttl: 2592000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                       Options.enable_blob_files: false
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                           Options.min_blob_size: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
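
The dump repeats once per shard because BlueStore splits its metadata across real RocksDB column families, the m-* and p-* families among them, each opened with its own copy of these options. Any RocksDB client must name every existing column family at open time, roughly as sketched below; the list is abbreviated to the shards visible in this excerpt, and the actual shard set comes from Ceph configuration this log does not show.

    #include <rocksdb/db.h>
    #include <string>
    #include <vector>

    // Hypothetical open path naming the sharded column families.
    rocksdb::DB* OpenOsdDb(const std::string& path,
                           const rocksdb::DBOptions& db_opts,
                           const rocksdb::ColumnFamilyOptions& cf_opts) {
      std::vector<rocksdb::ColumnFamilyDescriptor> cfs = {
          {rocksdb::kDefaultColumnFamilyName, cf_opts},
          {"m-0", cf_opts}, {"m-1", cf_opts}, {"m-2", cf_opts},
          {"p-0", cf_opts}, {"p-1", cf_opts},
      };
      std::vector<rocksdb::ColumnFamilyHandle*> handles;
      rocksdb::DB* db = nullptr;
      rocksdb::Status s = rocksdb::DB::Open(db_opts, path, cfs, &handles, &db);
      return s.ok() ? db : nullptr;
    }
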
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:           Options.merge_operator: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.compaction_filter: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55fb235a9b60)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55fb227cf350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.compression: LZ4
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:             Options.num_levels: 7
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                           Options.bloom_locality: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                               Options.ttl: 2592000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                       Options.enable_blob_files: false
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                           Options.min_blob_size: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:           Options.merge_operator: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.compaction_filter: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55fb235a9b60)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55fb227cf350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.compression: LZ4
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:             Options.num_levels: 7
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                           Options.bloom_locality: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                               Options.ttl: 2592000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                       Options.enable_blob_files: false
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                           Options.min_blob_size: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:           Options.merge_operator: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.compaction_filter: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55fb235a9b60)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55fb227cf350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.compression: LZ4
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:             Options.num_levels: 7
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                           Options.bloom_locality: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                               Options.ttl: 2592000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                       Options.enable_blob_files: false
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                           Options.min_blob_size: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:           Options.merge_operator: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.compaction_filter: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55fb235a9b80)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55fb227ce9b0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.compression: LZ4
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:             Options.num_levels: 7
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                           Options.bloom_locality: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                               Options.ttl: 2592000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                       Options.enable_blob_files: false
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                           Options.min_blob_size: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:           Options.merge_operator: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.compaction_filter: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55fb235a9b80)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55fb227ce9b0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.compression: LZ4
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:             Options.num_levels: 7
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                           Options.bloom_locality: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                               Options.ttl: 2592000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                       Options.enable_blob_files: false
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                           Options.min_blob_size: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:           Options.merge_operator: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.compaction_filter: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55fb235a9b80)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55fb227ce9b0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.compression: LZ4
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:             Options.num_levels: 7
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                           Options.bloom_locality: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                               Options.ttl: 2592000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                       Options.enable_blob_files: false
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                           Options.min_blob_size: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
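
The per-level size targets implied by the options dumped above follow directly from max_bytes_for_level_base (1 GiB) and max_bytes_for_level_multiplier (8), since level_compaction_dynamic_level_bytes is 0 and every addtl multiplier is 1. A minimal Python sketch of that arithmetic (illustrative only; the exact addtl indexing is glossed over because the multipliers are all 1 here):

    # Static level targets implied by the logged options.
    base = 1073741824        # max_bytes_for_level_base (1 GiB)
    mult = 8.0               # max_bytes_for_level_multiplier
    num_levels = 7           # Options.num_levels

    size = base
    for level in range(1, num_levels):
        print(f"L{level}: {size / 2**30:.0f} GiB")
        size *= mult         # addtl[...] are all 1, so they drop out

This yields L1 = 1 GiB up through L6 = 32768 GiB; on this 20 GiB OSD the deeper levels can never fill, so compaction effectively stays in the shallow levels.
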
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file: db/MANIFEST-000032 succeeded, manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5, prev_log_number is 0, max_column_family is 11, min_log_number_to_keep is 5
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: af557f43-9483-4a65-96a9-1d3a8a4b0b2d
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765013975639798, "job": 1, "event": "recovery_started", "wal_files": [31]}
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765013975640237, "job": 1, "event": "recovery_finished"}
Dec 06 09:39:35 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
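
The _open_db line echoes BlueStore's RocksDB option string verbatim: a comma-separated key=value list, with human-friendly sizes such as 2MB that RocksDB later expands (compaction_readahead_size shows up as 2097152 in the DBOptions dump below). A sketch of splitting it, assuming the string has been copied out of the log:

    opts_str = (
        "compression=kLZ4Compression,max_write_buffer_number=64,"
        "min_write_buffer_number_to_merge=6,"
        "compaction_style=kCompactionStyleLevel,"
        "write_buffer_size=16777216,max_background_jobs=4,"
        "level0_file_num_compaction_trigger=8,"
        "max_bytes_for_level_base=1073741824,"
        "max_bytes_for_level_multiplier=8,"
        "compaction_readahead_size=2MB,max_total_wal_size=1073741824,"
        "writable_file_max_buffer_size=0"
    )

    opts = dict(kv.split("=", 1) for kv in opts_str.split(","))
    print(opts["compression"])                # kLZ4Compression
    print(opts["compaction_readahead_size"])  # 2MB
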
Dec 06 09:39:35 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _open_super_meta old nid_max 1025
Dec 06 09:39:35 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _open_super_meta old blobid_max 10240
Dec 06 09:39:35 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _open_super_meta ondisk_format 4 compat_ondisk_format 3
Dec 06 09:39:35 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _open_super_meta min_alloc_size 0x1000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: freelist init
Dec 06 09:39:35 compute-1 ceph-osd[77465]: freelist _read_cfg
Dec 06 09:39:35 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _init_alloc loaded 20 GiB in 2 extents, allocator type hybrid, capacity 0x4ffc00000, block size 0x1000, free 0x4ffbfd000, fragmentation 1.9e-07
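
The _init_alloc numbers are self-consistent: capacity minus free is the space already allocated, and it is an exact multiple of the 0x1000 min_alloc_size. Worked out in Python:

    capacity = 0x4ffc00000   # from _init_alloc (about 20 GiB)
    free     = 0x4ffbfd000
    block    = 0x1000        # min_alloc_size from _open_super_meta

    used = capacity - free
    print(hex(used), used // block)   # 0x3000 -> 3 blocks, i.e. 12 KiB

Only 12 KiB is in use, consistent with a freshly created OSD.
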
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: [db/db_impl/db_impl.cc:496] Shutdown: canceling all background work
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: [db/db_impl/db_impl.cc:704] Shutdown complete
Dec 06 09:39:35 compute-1 ceph-osd[77465]: bluefs umount
Dec 06 09:39:35 compute-1 ceph-osd[77465]: bdev(0x55fb235d5000 /var/lib/ceph/osd/ceph-0/block) close
Dec 06 09:39:35 compute-1 ceph-osd[77465]: bdev(0x55fb235d5000 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Dec 06 09:39:35 compute-1 ceph-osd[77465]: bdev(0x55fb235d5000 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Dec 06 09:39:35 compute-1 ceph-osd[77465]: bdev(0x55fb235d5000 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Dec 06 09:39:35 compute-1 ceph-osd[77465]: bdev(0x55fb235d5000 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 06 09:39:35 compute-1 ceph-osd[77465]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-0/block size 20 GiB
Dec 06 09:39:35 compute-1 ceph-osd[77465]: bluefs mount
Dec 06 09:39:35 compute-1 ceph-osd[77465]: bluefs _init_alloc shared, id 1, capacity 0x4ffc00000, block size 0x10000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: bluefs mount final locked allocations 0 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec 06 09:39:35 compute-1 ceph-osd[77465]: bluefs mount final locked allocations 1 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec 06 09:39:35 compute-1 ceph-osd[77465]: bluefs mount final locked allocations 2 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec 06 09:39:35 compute-1 ceph-osd[77465]: bluefs mount final locked allocations 3 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec 06 09:39:35 compute-1 ceph-osd[77465]: bluefs mount final locked allocations 4 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Dec 06 09:39:35 compute-1 ceph-osd[77465]: bluefs mount shared_bdev_used = 4718592
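
shared_bdev_used is likewise a whole number of BlueFS allocation units (block size 0x10000 per the _init_alloc line above):

    shared_used  = 4718592    # bluefs mount shared_bdev_used
    bluefs_block = 0x10000

    print(shared_used // bluefs_block,   # 72 units
          shared_used / 2**20, "MiB")    # 4.5 MiB
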
Dec 06 09:39:35 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _prepare_db_environment set db_paths to db,20397110067 db.slow,20397110067
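
The db_paths budget logged here works out to exactly 95% of the block device size reported a few lines earlier, i.e. BlueStore leaves 5% headroom on this 20 GiB device (the 95% figure is read off the logged numbers, not from BlueStore source):

    device = 21470642176      # bdev open size
    budget = 20397110067      # db_paths size for db and db.slow

    print(round(budget / device, 6))   # 0.95
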
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: RocksDB version: 7.9.2
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Git sha 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Compile date 2025-07-17 03:12:14
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: DB SUMMARY
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: DB Session ID:  FMGV1GCT5BLWAHJBE976
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: CURRENT file:  CURRENT
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: IDENTITY file:  IDENTITY
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5097 ; 
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                         Options.error_if_exists: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                       Options.create_if_missing: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                         Options.paranoid_checks: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:             Options.flush_verify_memtable_count: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                                     Options.env: 0x55fb23742310
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                                      Options.fs: LegacyFileSystem
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                                Options.info_log: 0x55fb2287c7c0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.max_file_opening_threads: 16
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                              Options.statistics: (nil)
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                               Options.use_fsync: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                       Options.max_log_file_size: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.log_file_time_to_roll: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                       Options.keep_log_file_num: 1000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                    Options.recycle_log_file_num: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                         Options.allow_fallocate: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                        Options.allow_mmap_reads: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                       Options.allow_mmap_writes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                        Options.use_direct_reads: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.create_missing_column_families: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                              Options.db_log_dir: 
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                                 Options.wal_dir: db.wal
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.table_cache_numshardbits: 6
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                         Options.WAL_ttl_seconds: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                       Options.WAL_size_limit_MB: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:             Options.manifest_preallocation_size: 4194304
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                     Options.is_fd_close_on_exec: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.advise_random_on_open: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                    Options.db_write_buffer_size: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                    Options.write_buffer_manager: 0x55fb2369ea00
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.access_hint_on_compaction_start: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                      Options.use_adaptive_mutex: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                            Options.rate_limiter: (nil)
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                       Options.wal_recovery_mode: 2
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.enable_thread_tracking: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.enable_pipelined_write: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.unordered_write: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:             Options.write_thread_max_yield_usec: 100
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                               Options.row_cache: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                              Options.wal_filter: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:             Options.avoid_flush_during_recovery: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:             Options.allow_ingest_behind: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:             Options.two_write_queues: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:             Options.manual_wal_flush: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:             Options.wal_compression: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:             Options.atomic_flush: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                 Options.persist_stats_to_disk: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                 Options.write_dbid_to_manifest: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                 Options.log_readahead_size: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                 Options.best_efforts_recovery: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:             Options.allow_data_in_errors: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:             Options.db_host_id: __hostname__
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:             Options.enforce_single_del_contracts: true
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:             Options.max_background_jobs: 4
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:             Options.max_background_compactions: -1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:             Options.max_subcompactions: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:           Options.writable_file_max_buffer_size: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:             Options.delayed_write_rate : 16777216
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:             Options.max_total_wal_size: 1073741824
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.stats_dump_period_sec: 600
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                 Options.stats_persist_period_sec: 600
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                          Options.max_open_files: -1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                          Options.bytes_per_sync: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                      Options.wal_bytes_per_sync: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.strict_bytes_per_sync: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:       Options.compaction_readahead_size: 2097152
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.max_background_flushes: -1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Compression algorithms supported:
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         kZSTD supported: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         kXpressCompression supported: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         kBZip2Compression supported: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         kZSTDNotFinalCompression supported: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         kLZ4Compression supported: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         kZlibCompression supported: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         kLZ4HCCompression supported: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         kSnappyCompression supported: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Fast CRC32 supported: Supported on x86
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: DMutex implementation: pthread_mutex_t
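
The codec inventory explains why the configured compression works: kLZ4Compression, requested via the option string above, is among the algorithms compiled into this build, while zstd is not. A trivial cross-check over the values as logged:

    supported = {
        "kZSTD": 0, "kXpressCompression": 0, "kBZip2Compression": 0,
        "kZSTDNotFinalCompression": 0, "kLZ4Compression": 1,
        "kZlibCompression": 1, "kLZ4HCCompression": 1,
        "kSnappyCompression": 1,
    }

    configured = "kLZ4Compression"   # from the _open_db option string
    assert supported[configured], f"{configured} not compiled in"
    print(configured, "OK")
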
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.compaction_filter: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55fb235a9680)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55fb227cf350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.compression: LZ4
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:             Options.num_levels: 7
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                           Options.bloom_locality: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                               Options.ttl: 2592000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                       Options.enable_blob_files: false
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                           Options.min_blob_size: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
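
As with RocksDB's sharded LRU caches, the BinnedLRUCache capacity above is split across 2**num_shard_bits shards, so the two cache instances appearing in this log differ only in per-shard size (this is the generic sharding arithmetic, not a claim about BinnedLRUCache internals beyond it):

    for capacity in (483183820,    # cache 0x55fb227cf350: [default], [m-0], [m-1]
                     536870912):   # cache 0x55fb227ce9b0: the [O-2] dump earlier
        shards = 1 << 4            # num_shard_bits = 4
        print(shards, "shards x",
              round(capacity / shards / 2**20, 1), "MiB")

giving 16 shards of 28.8 MiB and 32.0 MiB respectively.
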
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:           Options.merge_operator: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.compaction_filter: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55fb235a9680)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55fb227cf350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.compression: LZ4
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:             Options.num_levels: 7
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                           Options.bloom_locality: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                               Options.ttl: 2592000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                       Options.enable_blob_files: false
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                           Options.min_blob_size: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:           Options.merge_operator: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.compaction_filter: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55fb235a9680)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55fb227cf350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.compression: LZ4
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:             Options.num_levels: 7
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                           Options.bloom_locality: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                               Options.ttl: 2592000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                       Options.enable_blob_files: false
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                           Options.min_blob_size: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
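The blocks of un-prefixed continuation lines above are how journald renders RocksDB's multi-line table_factory dump, and the same dump repeats verbatim for each column family that follows. A minimal sketch of folding those continuation lines into a dict so the per-family dumps can be diffed programmatically; parse_table_factory_options and the sample lines are hypothetical helpers for working with this log, not part of Ceph or RocksDB:

    #!/usr/bin/env python3
    # Hypothetical helper: fold "table_factory options:" continuation
    # lines (no journal prefix, just "key: value" or "key : value")
    # into a dict for comparison across column families.
    import re

    def parse_table_factory_options(lines):
        opts = {}
        for line in lines:
            m = re.match(r"\s*([A-Za-z0-9_]+)\s*:\s*(\S.*)$", line)
            if m:
                opts[m.group(1)] = m.group(2).strip()
        return opts

    sample = [
        "  block_size: 4096",
        "  format_version: 5",
        "  capacity : 483183820",
    ]
    print(parse_table_factory_options(sample))
    # {'block_size': '4096', 'format_version': '5', 'capacity': '483183820'}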
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:           Options.merge_operator: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.compaction_filter: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55fb235a9680)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55fb227cf350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.compression: LZ4
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:             Options.num_levels: 7
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                           Options.bloom_locality: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                               Options.ttl: 2592000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                       Options.enable_blob_files: false
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                           Options.min_blob_size: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
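Taken together, the memtable options logged above imply the following rough per-column-family sizing; this is back-of-envelope arithmetic for reading the dump, not anything ceph-osd computes:

    # write_buffer_size, max_write_buffer_number and
    # min_write_buffer_number_to_merge as logged above.
    write_buffer_size = 16777216            # 16 MiB per memtable
    max_write_buffer_number = 64            # cap on memtables per CF
    min_write_buffer_number_to_merge = 6    # merged into one L0 flush

    flush_batch = write_buffer_size * min_write_buffer_number_to_merge
    worst_case = write_buffer_size * max_write_buffer_number
    print(f"flush batch : {flush_batch / 2**20:.0f} MiB")   # 96 MiB
    print(f"per-CF cap  : {worst_case / 2**30:.0f} GiB")    # 1 GiB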
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:           Options.merge_operator: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.compaction_filter: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55fb235a9680)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55fb227cf350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.compression: LZ4
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:             Options.num_levels: 7
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                           Options.bloom_locality: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                               Options.ttl: 2592000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                       Options.enable_blob_files: false
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                           Options.min_blob_size: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
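With level_compaction_dynamic_level_bytes logged as 0, the level capacities follow statically from max_bytes_for_level_base and the 8x multiplier. A short sketch of that arithmetic; these are capacity targets per the options above, not space actually in use:

    base = 1073741824          # max_bytes_for_level_base (L1), 1 GiB
    mult = 8.0                 # max_bytes_for_level_multiplier
    target_file = 67108864     # target_file_size_base, 64 MiB

    for level in range(1, 7):  # num_levels = 7 gives L1..L6
        cap = base * mult ** (level - 1)
        print(f"L{level}: {cap / 2**30:>6.0f} GiB "
              f"(~{int(cap // target_file)} files of 64 MiB)")
    # L1 1 GiB, L2 8 GiB, L3 64 GiB, L4 512 GiB, L5 4096 GiB, L6 32768 GiB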
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:           Options.merge_operator: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.compaction_filter: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55fb235a9680)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55fb227cf350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.compression: LZ4
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:             Options.num_levels: 7
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                           Options.bloom_locality: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                               Options.ttl: 2592000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                       Options.enable_blob_files: false
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                           Options.min_blob_size: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
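Note that block_cache is the same pointer (0x55fb227cf350) in every column-family dump, so all of these families share one BinnedLRUCache. Quick arithmetic on its logged parameters follows; the observation that 483183820 bytes is 45% of 1 GiB would be consistent with BlueStore carving the KV cache out of a larger memory budget, but that split is an inference, not something this log states:

    capacity = 483183820       # bytes, from block_cache_options above
    num_shard_bits = 4         # 2**4 = 16 shards

    shards = 2 ** num_shard_bits
    print(f"total   : {capacity / 2**20:.1f} MiB")     # ~460.8 MiB
    print(f"shards  : {shards} x {capacity / shards / 2**20:.1f} MiB")
    print(f"of 1GiB : {capacity / 2**30:.2%}")         # 45.00%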
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:           Options.merge_operator: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.compaction_filter: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55fb235a9680)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55fb227cf350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.compression: LZ4
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:             Options.num_levels: 7
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                           Options.bloom_locality: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                               Options.ttl: 2592000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                       Options.enable_blob_files: false
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                           Options.min_blob_size: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:           Options.merge_operator: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.compaction_filter: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55fb235a9ac0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55fb227ce9b0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.compression: LZ4
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:             Options.num_levels: 7
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                           Options.bloom_locality: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                               Options.ttl: 2592000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                       Options.enable_blob_files: false
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                           Options.min_blob_size: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:           Options.merge_operator: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.compaction_filter: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55fb235a9ac0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55fb227ce9b0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.compression: LZ4
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:             Options.num_levels: 7
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                           Options.bloom_locality: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                               Options.ttl: 2592000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                       Options.enable_blob_files: false
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                           Options.min_blob_size: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:           Options.merge_operator: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.compaction_filter: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55fb235a9ac0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55fb227ce9b0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.write_buffer_size: 16777216
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:  Options.max_write_buffer_number: 64
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.compression: LZ4
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:             Options.num_levels: 7
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                           Options.bloom_locality: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                               Options.ttl: 2592000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                       Options.enable_blob_files: false
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                           Options.min_blob_size: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: af557f43-9483-4a65-96a9-1d3a8a4b0b2d
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765013975892825, "job": 1, "event": "recovery_started", "wal_files": [31]}
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765013975906245, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 35, "file_size": 1272, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13, "largest_seqno": 21, "table_properties": {"data_size": 128, "index_size": 27, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 87, "raw_average_key_size": 17, "raw_value_size": 82, "raw_average_value_size": 16, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 2, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": ".T:int64_array.b:bitwise_xor", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765013975, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "af557f43-9483-4a65-96a9-1d3a8a4b0b2d", "db_session_id": "FMGV1GCT5BLWAHJBE976", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765013975909947, "cf_name": "p-0", "job": 1, "event": "table_file_creation", "file_number": 36, "file_size": 1595, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14, "largest_seqno": 15, "table_properties": {"data_size": 469, "index_size": 39, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 72, "raw_average_key_size": 36, "raw_value_size": 571, "raw_average_value_size": 285, "num_data_blocks": 1, "num_entries": 2, "num_filter_entries": 2, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "p-0", "column_family_id": 4, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765013975, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "af557f43-9483-4a65-96a9-1d3a8a4b0b2d", "db_session_id": "FMGV1GCT5BLWAHJBE976", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765013975913041, "cf_name": "O-2", "job": 1, "event": "table_file_creation", "file_number": 37, "file_size": 1275, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16, "largest_seqno": 16, "table_properties": {"data_size": 121, "index_size": 64, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 55, "raw_average_key_size": 55, "raw_value_size": 50, "raw_average_value_size": 50, "num_data_blocks": 1, "num_entries": 1, "num_filter_entries": 1, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "O-2", "column_family_id": 9, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765013975, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "af557f43-9483-4a65-96a9-1d3a8a4b0b2d", "db_session_id": "FMGV1GCT5BLWAHJBE976", "orig_file_number": 37, "seqno_to_time_mapping": "N/A"}}
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765013975915119, "job": 1, "event": "recovery_finished"}
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: [db/version_set.cc:5047] Creating manifest 40
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x55fb23782000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: DB pointer 0x55fb23750000
Dec 06 09:39:35 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Dec 06 09:39:35 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _upgrade_super from 4, latest 4
Dec 06 09:39:35 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _upgrade_super done
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 09:39:35 compute-1 ceph-osd[77465]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s
                                           Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                           Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.013       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.013       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.013       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.01              0.00         1    0.013       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.01 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.01 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fb227cf350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fb227cf350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fb227cf350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fb227cf350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fb227cf350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fb227cf350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fb227cf350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fb227ce9b0#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fb227ce9b0#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.01 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.01 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fb227ce9b0#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fb227cf350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.1 total, 0.1 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fb227cf350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
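The dump above is RocksDB's periodic statistics report from the OSD's BlueStore database, one block per column family ([default], [m-*], [p-*], [O-*], [L], [P]); on an idle, freshly deployed OSD nearly every counter is zero, and the lines worth trending over time are the per-family "Cumulative compaction" and "Stalls(count)" entries. (The block-cache "occupancy: 18446744073709551615" is 2^64 - 1, almost certainly an unsigned counter printed after underflow rather than a real entry count.) A minimal sketch of pulling the interesting lines out of a saved copy of this excerpt — the file name osd0-stats.log is hypothetical:

    import re

    cf = None
    stats = {}
    # osd0-stats.log: hypothetical file holding the journal excerpt above
    with open("osd0-stats.log") as fh:
        for line in fh:
            line = line.strip()
            m = re.match(r"\*\* Compaction Stats \[(.+?)\] \*\*", line)
            if m:
                # remember which column family the following lines belong to
                cf = m.group(1)
                stats.setdefault(cf, [])
            elif cf and line.startswith(("Cumulative compaction:", "Stalls(count):")):
                stats[cf].append(line)

    for name, lines in stats.items():
        print(name)
        for entry in lines:
            print("   ", entry)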
Dec 06 09:39:35 compute-1 ceph-osd[77465]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/19.2.3/rpm/el9/BUILD/ceph-19.2.3/src/cls/cephfs/cls_cephfs.cc:201: loading cephfs
Dec 06 09:39:35 compute-1 ceph-osd[77465]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/19.2.3/rpm/el9/BUILD/ceph-19.2.3/src/cls/hello/cls_hello.cc:316: loading cls_hello
Dec 06 09:39:35 compute-1 ceph-osd[77465]: _get_class not permitted to load lua
Dec 06 09:39:35 compute-1 ceph-osd[77465]: _get_class not permitted to load sdk
Dec 06 09:39:35 compute-1 ceph-osd[77465]: osd.0 0 crush map has features 288232575208783872, adjusting msgr requires for clients
Dec 06 09:39:35 compute-1 ceph-osd[77465]: osd.0 0 crush map has features 288232575208783872 was 8705, adjusting msgr requires for mons
Dec 06 09:39:35 compute-1 ceph-osd[77465]: osd.0 0 crush map has features 288232575208783872, adjusting msgr requires for osds
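The three crush-map lines apply the same 64-bit feature mask (288232575208783872, up from 8705) to the client, mon, and osd messenger requirements. Which bits are set is easy to recover; the bit-to-feature-name mapping depends on the Ceph release, so this sketch decodes only the positions:

    # feature masks copied from the log lines above
    new, old = 288232575208783872, 8705
    for label, mask in (("new", new), ("old", old)):
        print(label, [b for b in range(64) if mask >> b & 1])
    # new -> [18, 25, 41, 58]; old -> [0, 9, 13]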
Dec 06 09:39:35 compute-1 ceph-osd[77465]: osd.0 0 check_osdmap_features enabling on-disk ERASURE CODES compat feature
Dec 06 09:39:35 compute-1 ceph-osd[77465]: osd.0 0 load_pgs
Dec 06 09:39:35 compute-1 ceph-osd[77465]: osd.0 0 load_pgs opened 0 pgs
Dec 06 09:39:35 compute-1 ceph-osd[77465]: osd.0 0 log_to_monitors true
Dec 06 09:39:35 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-0[77461]: 2025-12-06T09:39:35.971+0000 7fe795721740 -1 osd.0 0 log_to_monitors true
Dec 06 09:39:37 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : purged_snaps scrub starts
Dec 06 09:39:37 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : purged_snaps scrub ok
Dec 06 09:39:37 compute-1 ceph-osd[77465]: osd.0 0 done with init, starting boot process
Dec 06 09:39:37 compute-1 ceph-osd[77465]: osd.0 0 start_boot
Dec 06 09:39:37 compute-1 ceph-osd[77465]: osd.0 0 maybe_override_options_for_qos osd_max_backfills set to 1
Dec 06 09:39:37 compute-1 ceph-osd[77465]: osd.0 0 maybe_override_options_for_qos osd_recovery_max_active set to 0
Dec 06 09:39:37 compute-1 ceph-osd[77465]: osd.0 0 maybe_override_options_for_qos osd_recovery_max_active_hdd set to 3
Dec 06 09:39:37 compute-1 ceph-osd[77465]: osd.0 0 maybe_override_options_for_qos osd_recovery_max_active_ssd set to 10
Dec 06 09:39:37 compute-1 ceph-osd[77465]: osd.0 0  bench count 12288000 bsize 4 KiB
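The bench line is the QoS calibration write that follows the maybe_override_options_for_qos settings above; if, as the matching 4 KiB block size suggests, "count" is a byte total, it corresponds to 3000 individual writes:

    count_bytes = 12288000        # "bench count" from the log line above
    bsize = 4 * 1024              # "bsize 4 KiB"
    print(count_bytes // bsize)   # 3000 write ops, assuming count is in bytes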
Dec 06 09:39:37 compute-1 sudo[78152]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:39:37 compute-1 sudo[78152]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:39:37 compute-1 sudo[78152]: pam_unix(sudo:session): session closed for user root
Dec 06 09:39:38 compute-1 sudo[78177]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:39:38 compute-1 sudo[78177]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:39:38 compute-1 sudo[78177]: pam_unix(sudo:session): session closed for user root
Dec 06 09:39:38 compute-1 sudo[78202]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Dec 06 09:39:38 compute-1 sudo[78202]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:39:39 compute-1 podman[78299]: 2025-12-06 09:39:39.031501408 +0000 UTC m=+0.157191203 container exec 500f8c89b5c281d0ace139835b07ea5bc6259ce3e028d8284d57da97424b55e0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-crash-compute-1, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True)
Dec 06 09:39:39 compute-1 podman[78299]: 2025-12-06 09:39:39.142619231 +0000 UTC m=+0.268309006 container exec_died 500f8c89b5c281d0ace139835b07ea5bc6259ce3e028d8284d57da97424b55e0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-crash-compute-1, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325)
Dec 06 09:39:39 compute-1 sudo[78202]: pam_unix(sudo:session): session closed for user root
Dec 06 09:39:39 compute-1 sudo[78350]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:39:39 compute-1 sudo[78350]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:39:39 compute-1 sudo[78350]: pam_unix(sudo:session): session closed for user root
Dec 06 09:39:39 compute-1 sudo[78375]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ceph-volume --fsid 5ecd3f74-dade-5fc4-92ce-8950ae424258 -- inventory --format=json-pretty --filter-for-batch
Dec 06 09:39:39 compute-1 sudo[78375]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:39:40 compute-1 podman[78438]: 2025-12-06 09:39:39.996092002 +0000 UTC m=+0.042356762 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 06 09:39:40 compute-1 podman[78438]: 2025-12-06 09:39:40.101692155 +0000 UTC m=+0.147956905 container create aee8c6e29802d7f938eca68cd018362d6e0ea303b8d42e7acf66f57a6f5b663e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=silly_mendeleev, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 06 09:39:40 compute-1 systemd[1]: Started libpod-conmon-aee8c6e29802d7f938eca68cd018362d6e0ea303b8d42e7acf66f57a6f5b663e.scope.
Dec 06 09:39:40 compute-1 systemd[1]: Started libcrun container.
Dec 06 09:39:40 compute-1 podman[78438]: 2025-12-06 09:39:40.349778403 +0000 UTC m=+0.396043183 container init aee8c6e29802d7f938eca68cd018362d6e0ea303b8d42e7acf66f57a6f5b663e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=silly_mendeleev, org.label-schema.license=GPLv2, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_REF=squid, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec 06 09:39:40 compute-1 podman[78438]: 2025-12-06 09:39:40.358050648 +0000 UTC m=+0.404315388 container start aee8c6e29802d7f938eca68cd018362d6e0ea303b8d42e7acf66f57a6f5b663e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=silly_mendeleev, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.build-date=20250325, CEPH_REF=squid)
Dec 06 09:39:40 compute-1 silly_mendeleev[78455]: 167 167
Dec 06 09:39:40 compute-1 systemd[1]: libpod-aee8c6e29802d7f938eca68cd018362d6e0ea303b8d42e7acf66f57a6f5b663e.scope: Deactivated successfully.
Dec 06 09:39:40 compute-1 podman[78438]: 2025-12-06 09:39:40.399458007 +0000 UTC m=+0.445722747 container attach aee8c6e29802d7f938eca68cd018362d6e0ea303b8d42e7acf66f57a6f5b663e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=silly_mendeleev, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, io.buildah.version=1.40.1)
Dec 06 09:39:40 compute-1 podman[78438]: 2025-12-06 09:39:40.400765942 +0000 UTC m=+0.447030682 container died aee8c6e29802d7f938eca68cd018362d6e0ea303b8d42e7acf66f57a6f5b663e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=silly_mendeleev, ceph=True, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1)
Dec 06 09:39:40 compute-1 systemd[1]: var-lib-containers-storage-overlay-6111ee47524b5b6dc41ae8018fa81ea753c7e0a5fff66832be6e2713d62aaa7b-merged.mount: Deactivated successfully.
Dec 06 09:39:40 compute-1 podman[78438]: 2025-12-06 09:39:40.662615585 +0000 UTC m=+0.708880365 container remove aee8c6e29802d7f938eca68cd018362d6e0ea303b8d42e7acf66f57a6f5b663e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=silly_mendeleev, ceph=True, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 06 09:39:40 compute-1 systemd[1]: libpod-conmon-aee8c6e29802d7f938eca68cd018362d6e0ea303b8d42e7acf66f57a6f5b663e.scope: Deactivated successfully.
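Everything from image pull to container remove for silly_mendeleev happens in well under a second: cephadm runs its helper steps in short-lived containers, and this one apparently only probes the ceph uid/gid (167 167 is the ceph user and group id in the upstream images) before the real ceph-volume run. A sketch that pairs create/remove events from a saved journal file (compute-1.log is hypothetical) to flag containers that were created but never cleaned up:

    import re

    events = {}
    with open("compute-1.log") as fh:
        for line in fh:
            # podman logs "container create <64-hex-id>" / "container remove <64-hex-id>"
            m = re.search(r"container (create|remove) ([0-9a-f]{64})", line)
            if m:
                action, cid = m.groups()
                events.setdefault(cid, set()).add(action)

    leaked = [cid for cid, acts in events.items() if "remove" not in acts]
    print("containers never removed:", leaked or "none")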
Dec 06 09:39:40 compute-1 podman[78480]: 2025-12-06 09:39:40.902977566 +0000 UTC m=+0.083243142 container create 4286e59e578a317748ac04b2acd4f6e373f521e1a054fe2bec7b59049a618c51 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=kind_rosalind, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid)
Dec 06 09:39:40 compute-1 podman[78480]: 2025-12-06 09:39:40.864671105 +0000 UTC m=+0.044936661 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 06 09:39:41 compute-1 systemd[1]: Started libpod-conmon-4286e59e578a317748ac04b2acd4f6e373f521e1a054fe2bec7b59049a618c51.scope.
Dec 06 09:39:41 compute-1 systemd[1]: Started libcrun container.
Dec 06 09:39:41 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4464c5e5686d44a2ee9eeb629d19da23d71a23cdb9011766c7475be6581075d6/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 06 09:39:41 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4464c5e5686d44a2ee9eeb629d19da23d71a23cdb9011766c7475be6581075d6/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 06 09:39:41 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4464c5e5686d44a2ee9eeb629d19da23d71a23cdb9011766c7475be6581075d6/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 06 09:39:41 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4464c5e5686d44a2ee9eeb629d19da23d71a23cdb9011766c7475be6581075d6/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
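The four xfs warnings are the kernel's standard notice for filesystems created without the bigtime feature: timestamps are representable only up to 0x7fffffff seconds after the epoch, which is straightforward to confirm:

    from datetime import datetime, timezone
    print(datetime.fromtimestamp(0x7fffffff, tz=timezone.utc))
    # 2038-01-19 03:14:07+00:00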
Dec 06 09:39:41 compute-1 podman[78480]: 2025-12-06 09:39:41.355103873 +0000 UTC m=+0.535369429 container init 4286e59e578a317748ac04b2acd4f6e373f521e1a054fe2bec7b59049a618c51 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=kind_rosalind, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 09:39:41 compute-1 podman[78480]: 2025-12-06 09:39:41.362583251 +0000 UTC m=+0.542848787 container start 4286e59e578a317748ac04b2acd4f6e373f521e1a054fe2bec7b59049a618c51 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=kind_rosalind, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, ceph=True, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 06 09:39:41 compute-1 podman[78480]: 2025-12-06 09:39:41.439760033 +0000 UTC m=+0.620025569 container attach 4286e59e578a317748ac04b2acd4f6e373f521e1a054fe2bec7b59049a618c51 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=kind_rosalind, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 06 09:39:42 compute-1 kind_rosalind[78496]: [
Dec 06 09:39:42 compute-1 kind_rosalind[78496]:     {
Dec 06 09:39:42 compute-1 kind_rosalind[78496]:         "available": false,
Dec 06 09:39:42 compute-1 kind_rosalind[78496]:         "being_replaced": false,
Dec 06 09:39:42 compute-1 kind_rosalind[78496]:         "ceph_device_lvm": false,
Dec 06 09:39:42 compute-1 kind_rosalind[78496]:         "device_id": "QEMU_DVD-ROM_QM00001",
Dec 06 09:39:42 compute-1 kind_rosalind[78496]:         "lsm_data": {},
Dec 06 09:39:42 compute-1 kind_rosalind[78496]:         "lvs": [],
Dec 06 09:39:42 compute-1 kind_rosalind[78496]:         "path": "/dev/sr0",
Dec 06 09:39:42 compute-1 kind_rosalind[78496]:         "rejected_reasons": [
Dec 06 09:39:42 compute-1 kind_rosalind[78496]:             "Has a FileSystem",
Dec 06 09:39:42 compute-1 kind_rosalind[78496]:             "Insufficient space (<5GB)"
Dec 06 09:39:42 compute-1 kind_rosalind[78496]:         ],
Dec 06 09:39:42 compute-1 kind_rosalind[78496]:         "sys_api": {
Dec 06 09:39:42 compute-1 kind_rosalind[78496]:             "actuators": null,
Dec 06 09:39:42 compute-1 kind_rosalind[78496]:             "device_nodes": [
Dec 06 09:39:42 compute-1 kind_rosalind[78496]:                 "sr0"
Dec 06 09:39:42 compute-1 kind_rosalind[78496]:             ],
Dec 06 09:39:42 compute-1 kind_rosalind[78496]:             "devname": "sr0",
Dec 06 09:39:42 compute-1 kind_rosalind[78496]:             "human_readable_size": "482.00 KB",
Dec 06 09:39:42 compute-1 kind_rosalind[78496]:             "id_bus": "ata",
Dec 06 09:39:42 compute-1 kind_rosalind[78496]:             "model": "QEMU DVD-ROM",
Dec 06 09:39:42 compute-1 kind_rosalind[78496]:             "nr_requests": "2",
Dec 06 09:39:42 compute-1 kind_rosalind[78496]:             "parent": "/dev/sr0",
Dec 06 09:39:42 compute-1 kind_rosalind[78496]:             "partitions": {},
Dec 06 09:39:42 compute-1 kind_rosalind[78496]:             "path": "/dev/sr0",
Dec 06 09:39:42 compute-1 kind_rosalind[78496]:             "removable": "1",
Dec 06 09:39:42 compute-1 kind_rosalind[78496]:             "rev": "2.5+",
Dec 06 09:39:42 compute-1 kind_rosalind[78496]:             "ro": "0",
Dec 06 09:39:42 compute-1 kind_rosalind[78496]:             "rotational": "1",
Dec 06 09:39:42 compute-1 kind_rosalind[78496]:             "sas_address": "",
Dec 06 09:39:42 compute-1 kind_rosalind[78496]:             "sas_device_handle": "",
Dec 06 09:39:42 compute-1 kind_rosalind[78496]:             "scheduler_mode": "mq-deadline",
Dec 06 09:39:42 compute-1 kind_rosalind[78496]:             "sectors": 0,
Dec 06 09:39:42 compute-1 kind_rosalind[78496]:             "sectorsize": "2048",
Dec 06 09:39:42 compute-1 kind_rosalind[78496]:             "size": 493568.0,
Dec 06 09:39:42 compute-1 kind_rosalind[78496]:             "support_discard": "2048",
Dec 06 09:39:42 compute-1 kind_rosalind[78496]:             "type": "disk",
Dec 06 09:39:42 compute-1 kind_rosalind[78496]:             "vendor": "QEMU"
Dec 06 09:39:42 compute-1 kind_rosalind[78496]:         }
Dec 06 09:39:42 compute-1 kind_rosalind[78496]:     }
Dec 06 09:39:42 compute-1 kind_rosalind[78496]: ]
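
The JSON array above is device-inventory output from the short-lived helper container kind_rosalind: the only block device visible to it is the QEMU virtual DVD drive /dev/sr0, which is rejected as an OSD candidate ("Has a FileSystem", "Insufficient space (<5GB)"). The same report can be requested by hand; a minimal sketch, assuming the cephadm CLI that appears later in this log is available on the host:

    # ceph-volume's inventory subcommand, run through cephadm's wrapper;
    # --format json-pretty yields the structure logged above.
    sudo cephadm ceph-volume -- inventory --format json-pretty
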
Dec 06 09:39:42 compute-1 systemd[1]: libpod-4286e59e578a317748ac04b2acd4f6e373f521e1a054fe2bec7b59049a618c51.scope: Deactivated successfully.
Dec 06 09:39:42 compute-1 podman[78480]: 2025-12-06 09:39:42.099678628 +0000 UTC m=+1.279944194 container died 4286e59e578a317748ac04b2acd4f6e373f521e1a054fe2bec7b59049a618c51 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=kind_rosalind, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 09:39:44 compute-1 systemd[1]: var-lib-containers-storage-overlay-4464c5e5686d44a2ee9eeb629d19da23d71a23cdb9011766c7475be6581075d6-merged.mount: Deactivated successfully.
Dec 06 09:39:44 compute-1 podman[78480]: 2025-12-06 09:39:44.89426193 +0000 UTC m=+4.074527466 container remove 4286e59e578a317748ac04b2acd4f6e373f521e1a054fe2bec7b59049a618c51 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=kind_rosalind, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.build-date=20250325, OSD_FLAVOR=default)
Dec 06 09:39:44 compute-1 systemd[1]: libpod-conmon-4286e59e578a317748ac04b2acd4f6e373f521e1a054fe2bec7b59049a618c51.scope: Deactivated successfully.
Dec 06 09:39:44 compute-1 sudo[78375]: pam_unix(sudo:session): session closed for user root
Dec 06 09:39:47 compute-1 ceph-osd[77465]: osd.0 0 maybe_override_max_osd_capacity_for_qos osd bench result - bandwidth (MiB/sec): 22.135 iops: 5666.545 elapsed_sec: 0.529
Dec 06 09:39:47 compute-1 ceph-osd[77465]: log_channel(cluster) log [WRN] : OSD bench result of 5666.545158 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.0. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
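
The [WRN] above is mclock's automatic capacity probe: the osd bench result of 5666.545 IOPS falls outside the 50-500 IOPS plausibility window, so osd.0 keeps its default capacity of 315 IOPS (the stock hdd value). Following the message's own recommendation, an operator with a trusted external benchmark (e.g. fio) could pin the capacity explicitly; a hedged sketch, where 450 is a placeholder figure and not from this log:

    # Override the mclock capacity for this OSD only; use the _ssd variant
    # on flash media. 450 is illustrative, not a measured value.
    sudo ceph config set osd.0 osd_mclock_max_capacity_iops_hdd 450
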
Dec 06 09:39:47 compute-1 ceph-osd[77465]: osd.0 0 waiting for initial osdmap
Dec 06 09:39:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-0[77461]: 2025-12-06T09:39:47.884+0000 7fe7916a4640 -1 osd.0 0 waiting for initial osdmap
Dec 06 09:39:47 compute-1 ceph-osd[77465]: osd.0 11 crush map has features 288514051259236352, adjusting msgr requires for clients
Dec 06 09:39:47 compute-1 ceph-osd[77465]: osd.0 11 crush map has features 288514051259236352 was 288232575208792577, adjusting msgr requires for mons
Dec 06 09:39:47 compute-1 ceph-osd[77465]: osd.0 11 crush map has features 3314933000852226048, adjusting msgr requires for osds
Dec 06 09:39:47 compute-1 ceph-osd[77465]: osd.0 11 check_osdmap_features require_osd_release unknown -> squid
Dec 06 09:39:47 compute-1 ceph-osd[77465]: osd.0 11 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Dec 06 09:39:47 compute-1 ceph-osd[77465]: osd.0 11 set_numa_affinity not setting numa affinity
Dec 06 09:39:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-osd-0[77461]: 2025-12-06T09:39:47.913+0000 7fe78cccc640 -1 osd.0 11 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Dec 06 09:39:47 compute-1 ceph-osd[77465]: osd.0 11 _collect_metadata loop3:  no unique device id for loop3: fallback method has no model nor serial no unique device path for loop3: no symlink to loop3 in /dev/disk/by-path
Dec 06 09:39:48 compute-1 ceph-osd[77465]: osd.0 12 state: booting -> active
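
osd.0 is now up, and the pg_epoch lines that follow are its placement groups peering and activating on a single-OSD cluster (every PG has acting set [0], with osd.0 as its own primary). At this point cluster state would typically be verified with the standard CLI, for example:

    sudo cephadm shell -- ceph osd tree   # osd.0 should show up/in
    sudo cephadm shell -- ceph -s
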
Dec 06 09:39:52 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 15 pg[3.0( empty local-lis/les=0/0 n=0 ec=15/15 lis/c=0/0 les/c/f=0/0/0 sis=15) [0] r=0 lpr=15 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:39:53 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 16 pg[3.0( empty local-lis/les=15/16 n=0 ec=15/15 lis/c=0/0 les/c/f=0/0/0 sis=15) [0] r=0 lpr=15 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:39:54 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 17 pg[4.0( empty local-lis/les=0/0 n=0 ec=17/17 lis/c=0/0 les/c/f=0/0/0 sis=17) [0] r=0 lpr=17 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:39:58 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 18 pg[4.0( empty local-lis/les=17/18 n=0 ec=17/17 lis/c=0/0 les/c/f=0/0/0 sis=17) [0] r=0 lpr=17 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:01 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 19 pg[5.0( empty local-lis/les=0/0 n=0 ec=19/19 lis/c=0/0 les/c/f=0/0/0 sis=19) [0] r=0 lpr=19 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:01 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 20 pg[5.0( empty local-lis/les=19/20 n=0 ec=19/19 lis/c=0/0 les/c/f=0/0/0 sis=19) [0] r=0 lpr=19 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:04 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 21 pg[6.0( empty local-lis/les=0/0 n=0 ec=21/21 lis/c=0/0 les/c/f=0/0/0 sis=21) [0] r=0 lpr=21 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:05 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 23 pg[6.0( empty local-lis/les=21/23 n=0 ec=21/21 lis/c=0/0 les/c/f=0/0/0 sis=21) [0] r=0 lpr=21 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:06 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 24 pg[3.0( empty local-lis/les=15/16 n=0 ec=15/15 lis/c=15/15 les/c/f=16/16/0 sis=24 pruub=11.071418762s) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 active pruub 41.233177185s@ mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [0], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:40:06 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 24 pg[3.0( empty local-lis/les=15/16 n=0 ec=15/15 lis/c=15/15 les/c/f=16/16/0 sis=24 pruub=11.071418762s) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 unknown pruub 41.233177185s@ mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:07 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.18( empty local-lis/les=15/16 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:07 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.19( empty local-lis/les=15/16 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:07 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.17( empty local-lis/les=15/16 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:07 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.16( empty local-lis/les=15/16 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:07 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.15( empty local-lis/les=15/16 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:07 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.14( empty local-lis/les=15/16 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:07 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.13( empty local-lis/les=15/16 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:07 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.12( empty local-lis/les=15/16 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:07 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.11( empty local-lis/les=15/16 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:07 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.10( empty local-lis/les=15/16 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:07 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.f( empty local-lis/les=15/16 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:07 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.e( empty local-lis/les=15/16 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:07 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.d( empty local-lis/les=15/16 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:07 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.c( empty local-lis/les=15/16 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:07 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.b( empty local-lis/les=15/16 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:07 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.a( empty local-lis/les=15/16 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:07 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.7( empty local-lis/les=15/16 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:07 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.6( empty local-lis/les=15/16 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:07 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[4.0( empty local-lis/les=17/18 n=0 ec=17/17 lis/c=17/17 les/c/f=18/18/0 sis=25 pruub=14.984453201s) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 active pruub 46.179439545s@ mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [0], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:40:07 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.5( empty local-lis/les=15/16 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:07 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.2( empty local-lis/les=15/16 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:07 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.3( empty local-lis/les=15/16 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:07 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.4( empty local-lis/les=15/16 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:07 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.8( empty local-lis/les=15/16 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:07 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.1a( empty local-lis/les=15/16 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:07 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.9( empty local-lis/les=15/16 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:07 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.1b( empty local-lis/les=15/16 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:07 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.1c( empty local-lis/les=15/16 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:07 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.1d( empty local-lis/les=15/16 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:07 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.1( empty local-lis/les=15/16 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:07 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.1e( empty local-lis/les=15/16 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:07 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.18( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:07 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.1f( empty local-lis/les=15/16 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:07 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[4.0( empty local-lis/les=17/18 n=0 ec=17/17 lis/c=17/17 les/c/f=18/18/0 sis=25 pruub=14.984453201s) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 unknown pruub 46.179439545s@ mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:07 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.19( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:07 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.15( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:07 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.16( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:07 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.17( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:07 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.14( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:07 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.12( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:07 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.11( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:07 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.13( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:07 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.10( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:07 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.f( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:07 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.d( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:07 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.e( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:07 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.c( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:07 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.b( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:07 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.a( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:07 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.7( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:07 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.0( empty local-lis/les=24/25 n=0 ec=15/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:07 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.2( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:07 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.6( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:07 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.4( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:07 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.5( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:07 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.8( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:07 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.3( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:07 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.1a( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:07 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.1c( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:07 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.9( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:07 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.1b( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:07 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.1e( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:07 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.1( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:07 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.1f( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:07 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 25 pg[3.1d( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=15/15 les/c/f=16/16/0 sis=24) [0] r=0 lpr=24 pi=[15,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
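
The second wave of pool-3 activity above (PGs 3.1 through 3.1f appearing with ec=24/15, i.e. created at epoch 24 in a pool dating from epoch 15) is the signature of a pg_num increase splitting the pool's original single PG into 32; the same pattern repeats for pool 4 at epoch 25 below. That is plausibly the pg_autoscaler at work; the equivalent manual change would look like the following sketch, with the pool name hypothetical since this log shows only pool ids:

    # Raising pg_num creates the new PGs seen peering above.
    sudo ceph osd pool set <pool-name> pg_num 32
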
Dec 06 09:40:08 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.18 scrub starts
Dec 06 09:40:08 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.18 scrub ok
Dec 06 09:40:08 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.1f( empty local-lis/les=17/18 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:08 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.1e( empty local-lis/les=17/18 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:08 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.10( empty local-lis/les=17/18 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:08 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.11( empty local-lis/les=17/18 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:08 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.12( empty local-lis/les=17/18 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:08 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.13( empty local-lis/les=17/18 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:08 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.15( empty local-lis/les=17/18 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:08 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.14( empty local-lis/les=17/18 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:08 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.16( empty local-lis/les=17/18 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:08 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.17( empty local-lis/les=17/18 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:08 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.8( empty local-lis/les=17/18 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:08 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.9( empty local-lis/les=17/18 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:08 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.b( empty local-lis/les=17/18 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:08 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.a( empty local-lis/les=17/18 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:08 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.d( empty local-lis/les=17/18 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:08 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.7( empty local-lis/les=17/18 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:08 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.1( empty local-lis/les=17/18 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:08 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.2( empty local-lis/les=17/18 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:08 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.6( empty local-lis/les=17/18 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:08 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.5( empty local-lis/les=17/18 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:08 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.c( empty local-lis/les=17/18 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:08 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.4( empty local-lis/les=17/18 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:08 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.3( empty local-lis/les=17/18 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:08 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.f( empty local-lis/les=17/18 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:08 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.e( empty local-lis/les=17/18 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:08 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.1d( empty local-lis/les=17/18 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:08 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.1c( empty local-lis/les=17/18 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:08 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.1b( empty local-lis/les=17/18 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:08 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.1a( empty local-lis/les=17/18 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:08 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.19( empty local-lis/les=17/18 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:08 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.18( empty local-lis/les=17/18 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:08 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.1f( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:08 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.10( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:08 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.1e( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:08 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.11( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:08 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.13( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:08 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.12( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:08 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.14( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:08 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.16( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:08 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.17( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:08 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.8( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:08 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.9( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:08 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.15( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:08 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.b( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:08 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.a( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:08 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.0( empty local-lis/les=25/26 n=0 ec=17/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:08 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.7( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:08 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.d( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:08 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.2( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:08 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.1( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:08 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.6( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:08 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.5( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:08 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.4( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:08 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.c( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:08 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.3( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:08 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.e( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:08 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.f( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:08 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.1d( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:08 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.1a( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:08 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.1b( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:08 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.19( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:08 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.1c( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:08 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 26 pg[4.18( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=17/17 les/c/f=18/18/0 sis=25) [0] r=0 lpr=25 pi=[17,25)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:09 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.19 scrub starts
Dec 06 09:40:09 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.19 scrub ok
Dec 06 09:40:09 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 27 pg[5.0( empty local-lis/les=19/20 n=0 ec=19/19 lis/c=19/19 les/c/f=20/20/0 sis=27 pruub=15.629227638s) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 active pruub 49.513397217s@ mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [0], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:40:09 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 27 pg[5.0( empty local-lis/les=19/20 n=0 ec=19/19 lis/c=19/19 les/c/f=20/20/0 sis=27 pruub=15.629227638s) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 unknown pruub 49.513397217s@ mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:10 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.16 scrub starts
Dec 06 09:40:10 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.16 scrub ok
Dec 06 09:40:10 compute-1 sudo[79446]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:40:10 compute-1 sudo[79446]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:40:10 compute-1 sudo[79446]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:10 compute-1 sudo[79471]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 _orch deploy --fsid 5ecd3f74-dade-5fc4-92ce-8950ae424258
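
This sudo line is the orchestrator driving the next deployment step: the mgr ships a copy of cephadm into /var/lib/ceph/<fsid>/ and invokes its internal _orch deploy entry point against the pinned image and fsid. The container churn that follows (confident_bhaskara, elated_cori, and the mounts under var/lib/ceph/mon/ceph-compute-1) is that deploy preparing a monitor on this host. Once it lands, daemon placement can be inspected with the orchestrator CLI, e.g.:

    sudo cephadm shell -- ceph orch ps compute-1
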
Dec 06 09:40:10 compute-1 sudo[79471]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:40:11 compute-1 podman[79539]: 2025-12-06 09:40:11.004294742 +0000 UTC m=+0.062250882 container create f3e525ea47a5fc41abdec328bcd9b8261a3d335a3e3b4a068a5235b38541b524 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=confident_bhaskara, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 06 09:40:11 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.14 scrub starts
Dec 06 09:40:11 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.14 scrub ok
Dec 06 09:40:11 compute-1 systemd[1]: Started libpod-conmon-f3e525ea47a5fc41abdec328bcd9b8261a3d335a3e3b4a068a5235b38541b524.scope.
Dec 06 09:40:11 compute-1 podman[79539]: 2025-12-06 09:40:10.972917471 +0000 UTC m=+0.030873631 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 06 09:40:11 compute-1 systemd[1]: Started libcrun container.
Dec 06 09:40:11 compute-1 podman[79539]: 2025-12-06 09:40:11.104773265 +0000 UTC m=+0.162729425 container init f3e525ea47a5fc41abdec328bcd9b8261a3d335a3e3b4a068a5235b38541b524 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=confident_bhaskara, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 06 09:40:11 compute-1 podman[79539]: 2025-12-06 09:40:11.114458584 +0000 UTC m=+0.172414694 container start f3e525ea47a5fc41abdec328bcd9b8261a3d335a3e3b4a068a5235b38541b524 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=confident_bhaskara, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.schema-version=1.0)
Dec 06 09:40:11 compute-1 podman[79539]: 2025-12-06 09:40:11.118549223 +0000 UTC m=+0.176505333 container attach f3e525ea47a5fc41abdec328bcd9b8261a3d335a3e3b4a068a5235b38541b524 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=confident_bhaskara, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2)
Dec 06 09:40:11 compute-1 confident_bhaskara[79556]: 167 167
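
The bare "167 167" emitted by confident_bhaskara is cephadm's uid/gid probe: user and group ceph are both id 167 inside the Ceph image, and cephadm uses that pair to set ownership on the daemon's host-side directories. A hedged reproduction, assuming (as cephadm's internals appear to) that /var/lib/ceph in the image carries those ids:

    sudo podman run --rm \
        quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec \
        stat -c '%u %g' /var/lib/ceph
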
Dec 06 09:40:11 compute-1 systemd[1]: libpod-f3e525ea47a5fc41abdec328bcd9b8261a3d335a3e3b4a068a5235b38541b524.scope: Deactivated successfully.
Dec 06 09:40:11 compute-1 podman[79539]: 2025-12-06 09:40:11.123368377 +0000 UTC m=+0.181324497 container died f3e525ea47a5fc41abdec328bcd9b8261a3d335a3e3b4a068a5235b38541b524 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=confident_bhaskara, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, CEPH_REF=squid)
Dec 06 09:40:11 compute-1 systemd[1]: var-lib-containers-storage-overlay-d3e3b58292f6affe50334b4deb300aee298730cf5f38e1a1c0b91f0bc74f7a3a-merged.mount: Deactivated successfully.
Dec 06 09:40:11 compute-1 podman[79539]: 2025-12-06 09:40:11.173087274 +0000 UTC m=+0.231043384 container remove f3e525ea47a5fc41abdec328bcd9b8261a3d335a3e3b4a068a5235b38541b524 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=confident_bhaskara, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 06 09:40:11 compute-1 systemd[1]: libpod-conmon-f3e525ea47a5fc41abdec328bcd9b8261a3d335a3e3b4a068a5235b38541b524.scope: Deactivated successfully.
Dec 06 09:40:11 compute-1 podman[79572]: 2025-12-06 09:40:11.244641234 +0000 UTC m=+0.048435882 container create 15545a7f3b9bbdb79c6808e090321368094bcfb317db18faf592947982bf2095 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elated_cori, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, ceph=True)
Dec 06 09:40:11 compute-1 systemd[1]: Started libpod-conmon-15545a7f3b9bbdb79c6808e090321368094bcfb317db18faf592947982bf2095.scope.
Dec 06 09:40:11 compute-1 systemd[1]: Started libcrun container.
Dec 06 09:40:11 compute-1 podman[79572]: 2025-12-06 09:40:11.22384255 +0000 UTC m=+0.027637228 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 06 09:40:11 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5b493a9f615d2d02d11862f0cbeb950572f286e4c682b8067bcd8dc6d141f162/merged/tmp/keyring supports timestamps until 2038 (0x7fffffff)
Dec 06 09:40:11 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5b493a9f615d2d02d11862f0cbeb950572f286e4c682b8067bcd8dc6d141f162/merged/tmp/config supports timestamps until 2038 (0x7fffffff)
Dec 06 09:40:11 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5b493a9f615d2d02d11862f0cbeb950572f286e4c682b8067bcd8dc6d141f162/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 06 09:40:11 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5b493a9f615d2d02d11862f0cbeb950572f286e4c682b8067bcd8dc6d141f162/merged/var/lib/ceph/mon/ceph-compute-1 supports timestamps until 2038 (0x7fffffff)
Dec 06 09:40:11 compute-1 podman[79572]: 2025-12-06 09:40:11.338856626 +0000 UTC m=+0.142651274 container init 15545a7f3b9bbdb79c6808e090321368094bcfb317db18faf592947982bf2095 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elated_cori, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1)
Dec 06 09:40:11 compute-1 podman[79572]: 2025-12-06 09:40:11.345698909 +0000 UTC m=+0.149493557 container start 15545a7f3b9bbdb79c6808e090321368094bcfb317db18faf592947982bf2095 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elated_cori, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid)
Dec 06 09:40:11 compute-1 podman[79572]: 2025-12-06 09:40:11.349573595 +0000 UTC m=+0.153368273 container attach 15545a7f3b9bbdb79c6808e090321368094bcfb317db18faf592947982bf2095 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elated_cori, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325)
Dec 06 09:40:11 compute-1 systemd[1]: libpod-15545a7f3b9bbdb79c6808e090321368094bcfb317db18faf592947982bf2095.scope: Deactivated successfully.
Dec 06 09:40:11 compute-1 conmon[79588]: conmon 15545a7f3b9bbdb79c68 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-15545a7f3b9bbdb79c6808e090321368094bcfb317db18faf592947982bf2095.scope/container/memory.events
Dec 06 09:40:11 compute-1 podman[79614]: 2025-12-06 09:40:11.476486912 +0000 UTC m=+0.025013677 container died 15545a7f3b9bbdb79c6808e090321368094bcfb317db18faf592947982bf2095 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elated_cori, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.vendor=CentOS)
Dec 06 09:40:11 compute-1 systemd[1]: var-lib-containers-storage-overlay-5b493a9f615d2d02d11862f0cbeb950572f286e4c682b8067bcd8dc6d141f162-merged.mount: Deactivated successfully.
Dec 06 09:40:11 compute-1 podman[79614]: 2025-12-06 09:40:11.510424552 +0000 UTC m=+0.058951307 container remove 15545a7f3b9bbdb79c6808e090321368094bcfb317db18faf592947982bf2095 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elated_cori, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 06 09:40:11 compute-1 systemd[1]: libpod-conmon-15545a7f3b9bbdb79c6808e090321368094bcfb317db18faf592947982bf2095.scope: Deactivated successfully.
Dec 06 09:40:11 compute-1 systemd[1]: Reloading.
Dec 06 09:40:11 compute-1 systemd-sysv-generator[79656]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:40:11 compute-1 systemd-rc-local-generator[79650]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:40:11 compute-1 systemd[1]: Reloading.
Dec 06 09:40:12 compute-1 systemd-sysv-generator[79693]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:40:12 compute-1 systemd-rc-local-generator[79690]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:40:12 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.15 scrub starts
Dec 06 09:40:12 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.15 scrub ok
Dec 06 09:40:12 compute-1 systemd[1]: Starting Ceph mon.compute-1 for 5ecd3f74-dade-5fc4-92ce-8950ae424258...
Dec 06 09:40:12 compute-1 podman[79751]: 2025-12-06 09:40:12.489001741 +0000 UTC m=+0.055319697 container create d320de814b2790a418a9af21e3ae56e9af7093005540777d0244e1c42ff347ad (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mon-compute-1, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 09:40:12 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d326c8a7dc075b23ae3538f3eed9f441140fd37896ca58980e1d734a506dc82a/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 06 09:40:12 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d326c8a7dc075b23ae3538f3eed9f441140fd37896ca58980e1d734a506dc82a/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 06 09:40:12 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d326c8a7dc075b23ae3538f3eed9f441140fd37896ca58980e1d734a506dc82a/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 06 09:40:12 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d326c8a7dc075b23ae3538f3eed9f441140fd37896ca58980e1d734a506dc82a/merged/var/lib/ceph/mon/ceph-compute-1 supports timestamps until 2038 (0x7fffffff)
Dec 06 09:40:12 compute-1 podman[79751]: 2025-12-06 09:40:12.552978005 +0000 UTC m=+0.119295981 container init d320de814b2790a418a9af21e3ae56e9af7093005540777d0244e1c42ff347ad (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mon-compute-1, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 06 09:40:12 compute-1 podman[79751]: 2025-12-06 09:40:12.465356931 +0000 UTC m=+0.031674937 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 06 09:40:12 compute-1 podman[79751]: 2025-12-06 09:40:12.562031941 +0000 UTC m=+0.128349897 container start d320de814b2790a418a9af21e3ae56e9af7093005540777d0244e1c42ff347ad (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mon-compute-1, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 06 09:40:12 compute-1 bash[79751]: d320de814b2790a418a9af21e3ae56e9af7093005540777d0244e1c42ff347ad
Dec 06 09:40:12 compute-1 systemd[1]: Started Ceph mon.compute-1 for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec 06 09:40:12 compute-1 ceph-mon[79770]: set uid:gid to 167:167 (ceph:ceph)
Dec 06 09:40:12 compute-1 ceph-mon[79770]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mon, pid 2
Dec 06 09:40:12 compute-1 ceph-mon[79770]: pidfile_write: ignore empty --pid-file
Dec 06 09:40:12 compute-1 ceph-mon[79770]: load: jerasure load: lrc 
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb: RocksDB version: 7.9.2
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb: Git sha 0
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb: Compile date 2025-07-17 03:12:14
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb: DB SUMMARY
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb: DB Session ID:  1TK25AVRA1WQDS4JHM8T
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb: CURRENT file:  CURRENT
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb: IDENTITY file:  IDENTITY
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb: MANIFEST file:  MANIFEST-000005 size: 59 Bytes
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb: SST files in /var/lib/ceph/mon/ceph-compute-1/store.db dir, Total Num: 0, files: 
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-compute-1/store.db: 000004.log size: 511 ; 
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:                         Options.error_if_exists: 0
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:                       Options.create_if_missing: 0
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:                         Options.paranoid_checks: 1
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:             Options.flush_verify_memtable_count: 1
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:                                     Options.env: 0x55fbbd682c20
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:                                      Options.fs: PosixFileSystem
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:                                Options.info_log: 0x55fbbecdba20
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:                Options.max_file_opening_threads: 16
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:                              Options.statistics: (nil)
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:                               Options.use_fsync: 0
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:                       Options.max_log_file_size: 0
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:                   Options.log_file_time_to_roll: 0
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:                       Options.keep_log_file_num: 1000
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:                    Options.recycle_log_file_num: 0
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:                         Options.allow_fallocate: 1
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:                        Options.allow_mmap_reads: 0
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:                       Options.allow_mmap_writes: 0
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:                        Options.use_direct_reads: 0
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:          Options.create_missing_column_families: 0
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:                              Options.db_log_dir: 
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:                                 Options.wal_dir: 
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:                Options.table_cache_numshardbits: 6
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:                         Options.WAL_ttl_seconds: 0
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:                       Options.WAL_size_limit_MB: 0
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:             Options.manifest_preallocation_size: 4194304
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:                     Options.is_fd_close_on_exec: 1
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:                   Options.advise_random_on_open: 1
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:                    Options.db_write_buffer_size: 0
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:                    Options.write_buffer_manager: 0x55fbbecdf900
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:         Options.access_hint_on_compaction_start: 1
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:                      Options.use_adaptive_mutex: 0
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:                            Options.rate_limiter: (nil)
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:                       Options.wal_recovery_mode: 2
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:                  Options.enable_thread_tracking: 0
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:                  Options.enable_pipelined_write: 0
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:                  Options.unordered_write: 0
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:             Options.write_thread_max_yield_usec: 100
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:                               Options.row_cache: None
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:                              Options.wal_filter: None
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:             Options.avoid_flush_during_recovery: 0
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:             Options.allow_ingest_behind: 0
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:             Options.two_write_queues: 0
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:             Options.manual_wal_flush: 0
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:             Options.wal_compression: 0
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:             Options.atomic_flush: 0
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:                 Options.persist_stats_to_disk: 0
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:                 Options.write_dbid_to_manifest: 0
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:                 Options.log_readahead_size: 0
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:                 Options.best_efforts_recovery: 0
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:             Options.allow_data_in_errors: 0
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:             Options.db_host_id: __hostname__
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:             Options.enforce_single_del_contracts: true
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:             Options.max_background_jobs: 2
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:             Options.max_background_compactions: -1
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:             Options.max_subcompactions: 1
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:           Options.writable_file_max_buffer_size: 1048576
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:             Options.delayed_write_rate : 16777216
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:             Options.max_total_wal_size: 0
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:                   Options.stats_dump_period_sec: 600
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:                 Options.stats_persist_period_sec: 600
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:                          Options.max_open_files: -1
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:                          Options.bytes_per_sync: 0
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:                      Options.wal_bytes_per_sync: 0
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:                   Options.strict_bytes_per_sync: 0
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:       Options.compaction_readahead_size: 0
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:                  Options.max_background_flushes: -1
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb: Compression algorithms supported:
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:         kZSTD supported: 0
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:         kXpressCompression supported: 0
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:         kBZip2Compression supported: 0
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:         kZSTDNotFinalCompression supported: 0
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:         kLZ4Compression supported: 1
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:         kZlibCompression supported: 1
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:         kLZ4HCCompression supported: 1
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:         kSnappyCompression supported: 1
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb: Fast CRC32 supported: Supported on x86
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb: DMutex implementation: pthread_mutex_t
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-compute-1/store.db/MANIFEST-000005
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:           Options.merge_operator: 
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:        Options.compaction_filter: None
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:        Options.compaction_filter_factory: None
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:  Options.sst_partitioner_factory: None
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:         Options.memtable_factory: SkipListFactory
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:            Options.table_factory: BlockBasedTable
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55fbbecda5c0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55fbbecff350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:        Options.write_buffer_size: 33554432
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:  Options.max_write_buffer_number: 2
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:          Options.compression: NoCompression
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:                  Options.bottommost_compression: Disabled
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:       Options.prefix_extractor: nullptr
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:             Options.num_levels: 7
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:        Options.min_write_buffer_number_to_merge: 1
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:            Options.compression_opts.window_bits: -14
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:                  Options.compression_opts.level: 32767
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:               Options.compression_opts.strategy: 0
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:         Options.compression_opts.parallel_threads: 1
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:                  Options.compression_opts.enabled: false
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:      Options.level0_file_num_compaction_trigger: 4
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:              Options.level0_stop_writes_trigger: 36
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:                   Options.target_file_size_base: 67108864
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:             Options.target_file_size_multiplier: 1
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:                Options.max_bytes_for_level_base: 268435456
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:          Options.max_bytes_for_level_multiplier: 10.000000
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:                        Options.arena_block_size: 1048576
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:                Options.disable_auto_compactions: 0
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:                   Options.inplace_update_support: 0
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:                 Options.inplace_update_num_locks: 10000
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:               Options.memtable_whole_key_filtering: 0
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:   Options.memtable_huge_page_size: 0
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:                           Options.bloom_locality: 0
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:                    Options.max_successive_merges: 0
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:                Options.optimize_filters_for_hits: 0
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:                Options.paranoid_file_checks: 0
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:                Options.force_consistency_checks: 1
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:                Options.report_bg_io_stats: 0
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:                               Options.ttl: 2592000
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:          Options.periodic_compaction_seconds: 0
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:    Options.preserve_internal_time_seconds: 0
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:                       Options.enable_blob_files: false
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:                           Options.min_blob_size: 0
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:                          Options.blob_file_size: 268435456
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:                   Options.blob_compression_type: NoCompression
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:          Options.enable_blob_garbage_collection: false
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:          Options.blob_compaction_readahead_size: 0
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb:                Options.blob_file_starting_level: 0
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-compute-1/store.db/MANIFEST-000005 succeeded,manifest_file_number is 5, next_file_number is 7, last_sequence is 0, log_number is 0,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 0
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 0
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014012604679, "job": 1, "event": "recovery_started", "wal_files": [4]}
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #4 mode 2
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014012606628, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 8, "file_size": 1648, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 1, "largest_seqno": 5, "table_properties": {"data_size": 523, "index_size": 31, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 115, "raw_average_key_size": 23, "raw_value_size": 401, "raw_average_value_size": 80, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014012, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 8, "seqno_to_time_mapping": "N/A"}}
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014012606736, "job": 1, "event": "recovery_finished"}
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb: [db/version_set.cc:5047] Creating manifest 10
Dec 06 09:40:12 compute-1 sudo[79471]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000004.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x55fbbed00e00
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb: DB pointer 0x55fbbee0a000
Dec 06 09:40:12 compute-1 ceph-mon[79770]: mon.compute-1 does not exist in monmap, will attempt to join an existing cluster
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 09:40:12 compute-1 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 0.0 total, 0.0 interval
                                           Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s
                                           Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                           Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Sum      1/0    1.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.0 total, 0.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.10 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.10 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fbbecff350#2 capacity: 512.00 MB usage: 0.22 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 5.7e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.11 KB,2.08616e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
Dec 06 09:40:12 compute-1 ceph-mon[79770]: using public_addr v2:192.168.122.101:0/0 -> [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0]
Dec 06 09:40:12 compute-1 ceph-mon[79770]: starting mon.compute-1 rank -1 at public addrs [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] at bind addrs [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] mon_data /var/lib/ceph/mon/ceph-compute-1 fsid 5ecd3f74-dade-5fc4-92ce-8950ae424258
Dec 06 09:40:12 compute-1 ceph-mon[79770]: mon.compute-1@-1(???) e0 preinit fsid 5ecd3f74-dade-5fc4-92ce-8950ae424258
Dec 06 09:40:12 compute-1 ceph-mon[79770]: mon.compute-1@-1(synchronizing).mds e1 new map
Dec 06 09:40:12 compute-1 ceph-mon[79770]: mon.compute-1@-1(synchronizing).mds e1 print_map
                                           e1
                                           btime 2025-12-06T09:37:41.285728+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: -1
                                            
                                           No filesystems configured
Dec 06 09:40:12 compute-1 ceph-mon[79770]: mon.compute-1@-1(synchronizing).osd e0 _set_cache_ratios kv ratio 0.25 inc ratio 0.375 full ratio 0.375
Dec 06 09:40:12 compute-1 ceph-mon[79770]: mon.compute-1@-1(synchronizing).osd e0 register_cache_with_pcm pcm target: 2147483648 pcm max: 1020054732 pcm min: 134217728 inc_osd_cache size: 1
Dec 06 09:40:12 compute-1 ceph-mon[79770]: mon.compute-1@-1(synchronizing).osd e1 e1: 0 total, 0 up, 0 in
Dec 06 09:40:12 compute-1 ceph-mon[79770]: mon.compute-1@-1(synchronizing).osd e2 e2: 0 total, 0 up, 0 in
Dec 06 09:40:12 compute-1 ceph-mon[79770]: mon.compute-1@-1(synchronizing).osd e3 e3: 0 total, 0 up, 0 in
Dec 06 09:40:12 compute-1 ceph-mon[79770]: mon.compute-1@-1(synchronizing).osd e4 e4: 1 total, 0 up, 1 in
Dec 06 09:40:12 compute-1 ceph-mon[79770]: mon.compute-1@-1(synchronizing).osd e5 e5: 2 total, 0 up, 2 in
Dec 06 09:40:12 compute-1 ceph-mon[79770]: mon.compute-1@-1(synchronizing).osd e6 e6: 2 total, 0 up, 2 in
Dec 06 09:40:12 compute-1 ceph-mon[79770]: mon.compute-1@-1(synchronizing).osd e7 e7: 2 total, 0 up, 2 in
Dec 06 09:40:12 compute-1 ceph-mon[79770]: mon.compute-1@-1(synchronizing).osd e8 e8: 2 total, 0 up, 2 in
Dec 06 09:40:12 compute-1 ceph-mon[79770]: mon.compute-1@-1(synchronizing).osd e9 e9: 2 total, 0 up, 2 in
Dec 06 09:40:12 compute-1 ceph-mon[79770]: mon.compute-1@-1(synchronizing).osd e10 e10: 2 total, 1 up, 2 in
Dec 06 09:40:12 compute-1 ceph-mon[79770]: mon.compute-1@-1(synchronizing).osd e11 e11: 2 total, 1 up, 2 in
Dec 06 09:40:12 compute-1 ceph-mon[79770]: mon.compute-1@-1(synchronizing).osd e12 e12: 2 total, 2 up, 2 in
Dec 06 09:40:12 compute-1 ceph-mon[79770]: mon.compute-1@-1(synchronizing).osd e13 e13: 2 total, 2 up, 2 in
Dec 06 09:40:12 compute-1 ceph-mon[79770]: mon.compute-1@-1(synchronizing).osd e14 e14: 2 total, 2 up, 2 in
Dec 06 09:40:12 compute-1 ceph-mon[79770]: mon.compute-1@-1(synchronizing).osd e15 e15: 2 total, 2 up, 2 in
Dec 06 09:40:12 compute-1 ceph-mon[79770]: mon.compute-1@-1(synchronizing).osd e16 e16: 2 total, 2 up, 2 in
Dec 06 09:40:12 compute-1 ceph-mon[79770]: mon.compute-1@-1(synchronizing).osd e17 e17: 2 total, 2 up, 2 in
Dec 06 09:40:12 compute-1 ceph-mon[79770]: mon.compute-1@-1(synchronizing).osd e18 e18: 2 total, 2 up, 2 in
Dec 06 09:40:12 compute-1 ceph-mon[79770]: mon.compute-1@-1(synchronizing).osd e19 e19: 2 total, 2 up, 2 in
Dec 06 09:40:12 compute-1 ceph-mon[79770]: mon.compute-1@-1(synchronizing).osd e20 e20: 2 total, 2 up, 2 in
Dec 06 09:40:12 compute-1 ceph-mon[79770]: mon.compute-1@-1(synchronizing).osd e21 e21: 2 total, 2 up, 2 in
Dec 06 09:40:12 compute-1 ceph-mon[79770]: mon.compute-1@-1(synchronizing).osd e22 e22: 2 total, 2 up, 2 in
Dec 06 09:40:12 compute-1 ceph-mon[79770]: mon.compute-1@-1(synchronizing).osd e23 e23: 2 total, 2 up, 2 in
Dec 06 09:40:12 compute-1 ceph-mon[79770]: mon.compute-1@-1(synchronizing).osd e24 e24: 2 total, 2 up, 2 in
Dec 06 09:40:12 compute-1 ceph-mon[79770]: mon.compute-1@-1(synchronizing).osd e25 e25: 2 total, 2 up, 2 in
Dec 06 09:40:12 compute-1 ceph-mon[79770]: mon.compute-1@-1(synchronizing).osd e26 e26: 2 total, 2 up, 2 in
Dec 06 09:40:12 compute-1 ceph-mon[79770]: mon.compute-1@-1(synchronizing).osd e27 e27: 2 total, 2 up, 2 in
Dec 06 09:40:12 compute-1 ceph-mon[79770]: mon.compute-1@-1(synchronizing).osd e27 crush map has features 3314933000852226048, adjusting msgr requires
Dec 06 09:40:12 compute-1 ceph-mon[79770]: mon.compute-1@-1(synchronizing).osd e27 crush map has features 288514051259236352, adjusting msgr requires
Dec 06 09:40:12 compute-1 ceph-mon[79770]: mon.compute-1@-1(synchronizing).osd e27 crush map has features 288514051259236352, adjusting msgr requires
Dec 06 09:40:12 compute-1 ceph-mon[79770]: mon.compute-1@-1(synchronizing).osd e27 crush map has features 288514051259236352, adjusting msgr requires
Dec 06 09:40:12 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pg_num", "val": "32"}]': finished
Dec 06 09:40:12 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/2735601092' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "volumes", "app": "rbd"}]': finished
Dec 06 09:40:12 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num_actual", "val": "32"}]': finished
Dec 06 09:40:12 compute-1 ceph-mon[79770]: osdmap e25: 2 total, 2 up, 2 in
Dec 06 09:40:12 compute-1 ceph-mon[79770]: Health check cleared: CEPHADM_APPLY_SPEC_FAIL (was: Failed to apply 2 service(s): mon,mgr)
Dec 06 09:40:12 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:40:12 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/250124401' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "backups", "app": "rbd"}]: dispatch
Dec 06 09:40:12 compute-1 ceph-mon[79770]: 2.1d scrub starts
Dec 06 09:40:12 compute-1 ceph-mon[79770]: 2.1d scrub ok
Dec 06 09:40:12 compute-1 ceph-mon[79770]: 3.18 scrub starts
Dec 06 09:40:12 compute-1 ceph-mon[79770]: 3.18 scrub ok
Dec 06 09:40:12 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/250124401' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "backups", "app": "rbd"}]': finished
Dec 06 09:40:12 compute-1 ceph-mon[79770]: osdmap e26: 2 total, 2 up, 2 in
Dec 06 09:40:12 compute-1 ceph-mon[79770]: pgmap v82: 100 pgs: 1 peering, 94 unknown, 5 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Dec 06 09:40:12 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec 06 09:40:12 compute-1 ceph-mon[79770]: mon.compute-1@-1(synchronizing).paxosservice(auth 1..7) refresh upgraded, format 0 -> 3
Dec 06 09:40:13 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.17 scrub starts
Dec 06 09:40:13 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.17 scrub ok
Dec 06 09:40:13 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.12 scrub starts
Dec 06 09:40:13 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.12 scrub ok
Dec 06 09:40:14 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.11 deep-scrub starts
Dec 06 09:40:14 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.11 deep-scrub ok
Dec 06 09:40:15 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.10 scrub starts
Dec 06 09:40:15 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.10 scrub ok
Dec 06 09:40:16 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.13 scrub starts
Dec 06 09:40:16 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.13 scrub ok
Dec 06 09:40:17 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.f scrub starts
Dec 06 09:40:17 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.f scrub ok
Dec 06 09:40:18 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.e scrub starts
Dec 06 09:40:18 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.e scrub ok
Dec 06 09:40:19 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.1f( empty local-lis/les=19/20 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:19 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.13( empty local-lis/les=19/20 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:19 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.1e( empty local-lis/les=19/20 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:19 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.10( empty local-lis/les=19/20 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:19 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.12( empty local-lis/les=19/20 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:19 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.15( empty local-lis/les=19/20 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:19 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.14( empty local-lis/les=19/20 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:19 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.17( empty local-lis/les=19/20 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:19 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.11( empty local-lis/les=19/20 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:19 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.16( empty local-lis/les=19/20 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:19 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.9( empty local-lis/les=19/20 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:19 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.8( empty local-lis/les=19/20 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:19 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.b( empty local-lis/les=19/20 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:19 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.a( empty local-lis/les=19/20 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:19 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.d( empty local-lis/les=19/20 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:19 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.c( empty local-lis/les=19/20 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:19 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.6( empty local-lis/les=19/20 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:19 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.1( empty local-lis/les=19/20 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:19 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.3( empty local-lis/les=19/20 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:19 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.7( empty local-lis/les=19/20 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:19 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.4( empty local-lis/les=19/20 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:19 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.5( empty local-lis/les=19/20 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:19 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.2( empty local-lis/les=19/20 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:19 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.e( empty local-lis/les=19/20 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:19 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.f( empty local-lis/les=19/20 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:19 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.1c( empty local-lis/les=19/20 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:19 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.1d( empty local-lis/les=19/20 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:19 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.1a( empty local-lis/les=19/20 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:19 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.1b( empty local-lis/les=19/20 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:19 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.19( empty local-lis/les=19/20 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:19 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.18( empty local-lis/les=19/20 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:19 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.1f( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:19 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.13( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:19 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.12( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:19 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.10( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:19 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.11( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:19 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.15( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:19 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.17( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:19 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.14( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:19 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.16( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:19 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.9( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:19 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.b( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:19 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.a( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:19 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.c( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:19 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.d( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:19 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.6( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:19 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.1e( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:19 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.0( empty local-lis/les=27/28 n=0 ec=19/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:19 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.1( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:19 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.3( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:19 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.8( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:19 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.4( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:19 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.7( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:19 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.5( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:19 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.e( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:19 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.f( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:19 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.2( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:19 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.1c( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:19 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.1d( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:19 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.1a( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:19 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.1b( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:19 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.19( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:19 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 28 pg[5.18( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=19/19 les/c/f=20/20/0 sis=27) [0] r=0 lpr=27 pi=[19,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:19 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.d scrub starts
Dec 06 09:40:19 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.d scrub ok
Dec 06 09:40:20 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.b deep-scrub starts
Dec 06 09:40:20 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.b deep-scrub ok
Dec 06 09:40:22 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.7 deep-scrub starts
Dec 06 09:40:22 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.7 deep-scrub ok
Dec 06 09:40:22 compute-1 ceph-mon[79770]: mon.compute-1@-1(probing) e3  my rank is now 2 (was -1)
Dec 06 09:40:22 compute-1 ceph-mon[79770]: log_channel(cluster) log [INF] : mon.compute-1 calling monitor election
Dec 06 09:40:22 compute-1 ceph-mon[79770]: paxos.2).electionLogic(0) init, first boot, initializing epoch at 1 
Dec 06 09:40:22 compute-1 ceph-mon[79770]: mon.compute-1@2(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 09:40:23 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.a deep-scrub starts
Dec 06 09:40:23 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.a deep-scrub ok
Dec 06 09:40:24 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.0 scrub starts
Dec 06 09:40:24 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.0 scrub ok
Dec 06 09:40:25 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.c scrub starts
Dec 06 09:40:25 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.c scrub ok
Dec 06 09:40:25 compute-1 ceph-mon[79770]: mon.compute-1@2(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 09:40:25 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={4=support erasure code pools,5=new-style osdmap encoding,6=support isa/lrc erasure code,7=support shec erasure code}
Dec 06 09:40:25 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={8=support monmap features,9=luminous ondisk layout,10=mimic ondisk layout,11=nautilus ondisk layout,12=octopus ondisk layout,13=pacific ondisk layout,14=quincy ondisk layout,15=reef ondisk layout,16=squid ondisk layout}
Dec 06 09:40:25 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e28 e28: 2 total, 2 up, 2 in
Dec 06 09:40:26 compute-1 ceph-mon[79770]: 2.1f scrub starts
Dec 06 09:40:26 compute-1 ceph-mon[79770]: 2.1f scrub ok
Dec 06 09:40:26 compute-1 ceph-mon[79770]: 3.16 scrub starts
Dec 06 09:40:26 compute-1 ceph-mon[79770]: 3.16 scrub ok
Dec 06 09:40:26 compute-1 ceph-mon[79770]: Deploying daemon mon.compute-1 on compute-1
Dec 06 09:40:26 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/3524701111' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "images", "app": "rbd"}]: dispatch
Dec 06 09:40:26 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "mon metadata", "id": "compute-0"}]: dispatch
Dec 06 09:40:26 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Dec 06 09:40:26 compute-1 ceph-mon[79770]: mon.compute-0 calling monitor election
Dec 06 09:40:26 compute-1 ceph-mon[79770]: pgmap v84: 131 pgs: 1 peering, 62 unknown, 68 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Dec 06 09:40:26 compute-1 ceph-mon[79770]: 2.8 scrub starts
Dec 06 09:40:26 compute-1 ceph-mon[79770]: 2.8 scrub ok
Dec 06 09:40:26 compute-1 ceph-mon[79770]: 3.14 scrub starts
Dec 06 09:40:26 compute-1 ceph-mon[79770]: 3.14 scrub ok
Dec 06 09:40:26 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Dec 06 09:40:26 compute-1 ceph-mon[79770]: 2.1b scrub starts
Dec 06 09:40:26 compute-1 ceph-mon[79770]: 2.1b scrub ok
Dec 06 09:40:26 compute-1 ceph-mon[79770]: 3.15 scrub starts
Dec 06 09:40:26 compute-1 ceph-mon[79770]: 3.15 scrub ok
Dec 06 09:40:26 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Dec 06 09:40:26 compute-1 ceph-mon[79770]: pgmap v85: 131 pgs: 31 unknown, 100 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Dec 06 09:40:26 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Dec 06 09:40:26 compute-1 ceph-mon[79770]: 3.17 scrub starts
Dec 06 09:40:26 compute-1 ceph-mon[79770]: 2.9 scrub starts
Dec 06 09:40:26 compute-1 ceph-mon[79770]: 3.17 scrub ok
Dec 06 09:40:26 compute-1 ceph-mon[79770]: 2.9 scrub ok
Dec 06 09:40:26 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Dec 06 09:40:26 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Dec 06 09:40:26 compute-1 ceph-mon[79770]: 3.12 scrub starts
Dec 06 09:40:26 compute-1 ceph-mon[79770]: 3.12 scrub ok
Dec 06 09:40:26 compute-1 ceph-mon[79770]: 2.a deep-scrub starts
Dec 06 09:40:26 compute-1 ceph-mon[79770]: 2.a deep-scrub ok
Dec 06 09:40:26 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Dec 06 09:40:26 compute-1 ceph-mon[79770]: pgmap v86: 131 pgs: 1 peering, 31 unknown, 99 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Dec 06 09:40:26 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Dec 06 09:40:26 compute-1 ceph-mon[79770]: 3.11 deep-scrub starts
Dec 06 09:40:26 compute-1 ceph-mon[79770]: 3.11 deep-scrub ok
Dec 06 09:40:26 compute-1 ceph-mon[79770]: 2.1e scrub starts
Dec 06 09:40:26 compute-1 ceph-mon[79770]: 2.1e scrub ok
Dec 06 09:40:26 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Dec 06 09:40:26 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Dec 06 09:40:26 compute-1 ceph-mon[79770]: 3.10 scrub starts
Dec 06 09:40:26 compute-1 ceph-mon[79770]: 3.10 scrub ok
Dec 06 09:40:26 compute-1 ceph-mon[79770]: 2.2 scrub starts
Dec 06 09:40:26 compute-1 ceph-mon[79770]: 2.2 scrub ok
Dec 06 09:40:26 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Dec 06 09:40:26 compute-1 ceph-mon[79770]: pgmap v87: 131 pgs: 1 peering, 31 unknown, 99 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Dec 06 09:40:26 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Dec 06 09:40:26 compute-1 ceph-mon[79770]: 3.13 scrub starts
Dec 06 09:40:26 compute-1 ceph-mon[79770]: 3.13 scrub ok
Dec 06 09:40:26 compute-1 ceph-mon[79770]: 2.6 scrub starts
Dec 06 09:40:26 compute-1 ceph-mon[79770]: 2.6 scrub ok
Dec 06 09:40:26 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Dec 06 09:40:26 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Dec 06 09:40:26 compute-1 ceph-mon[79770]: 3.f scrub starts
Dec 06 09:40:26 compute-1 ceph-mon[79770]: 2.5 scrub starts
Dec 06 09:40:26 compute-1 ceph-mon[79770]: 2.5 scrub ok
Dec 06 09:40:26 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Dec 06 09:40:26 compute-1 ceph-mon[79770]: pgmap v88: 131 pgs: 1 peering, 31 unknown, 99 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Dec 06 09:40:26 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Dec 06 09:40:26 compute-1 ceph-mon[79770]: mon.compute-0 is new leader, mons compute-0,compute-2 in quorum (ranks 0,1)
Dec 06 09:40:26 compute-1 ceph-mon[79770]: monmap epoch 2
Dec 06 09:40:26 compute-1 ceph-mon[79770]: mon.compute-2 calling monitor election
Dec 06 09:40:26 compute-1 ceph-mon[79770]: 3.f scrub ok
Dec 06 09:40:26 compute-1 ceph-mon[79770]: 3.e scrub starts
Dec 06 09:40:26 compute-1 ceph-mon[79770]: fsid 5ecd3f74-dade-5fc4-92ce-8950ae424258
Dec 06 09:40:26 compute-1 ceph-mon[79770]: last_changed 2025-12-06T09:40:10.449868+0000
Dec 06 09:40:26 compute-1 ceph-mon[79770]: created 2025-12-06T09:37:38.663870+0000
Dec 06 09:40:26 compute-1 ceph-mon[79770]: min_mon_release 19 (squid)
Dec 06 09:40:26 compute-1 ceph-mon[79770]: election_strategy: 1
Dec 06 09:40:26 compute-1 ceph-mon[79770]: 0: [v2:192.168.122.100:3300/0,v1:192.168.122.100:6789/0] mon.compute-0
Dec 06 09:40:26 compute-1 ceph-mon[79770]: 1: [v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0] mon.compute-2
Dec 06 09:40:26 compute-1 ceph-mon[79770]: fsmap 
Dec 06 09:40:26 compute-1 ceph-mon[79770]: osdmap e27: 2 total, 2 up, 2 in
Dec 06 09:40:26 compute-1 ceph-mon[79770]: mgrmap e9: compute-0.qhdjwa(active, since 2m)
Dec 06 09:40:26 compute-1 ceph-mon[79770]: Health detail: HEALTH_WARN 3 pool(s) do not have an application enabled
Dec 06 09:40:26 compute-1 ceph-mon[79770]: [WRN] POOL_APP_NOT_ENABLED: 3 pool(s) do not have an application enabled
Dec 06 09:40:26 compute-1 ceph-mon[79770]:     application not enabled on pool 'images'
Dec 06 09:40:26 compute-1 ceph-mon[79770]:     application not enabled on pool 'cephfs.cephfs.meta'
Dec 06 09:40:26 compute-1 ceph-mon[79770]:     application not enabled on pool 'cephfs.cephfs.data'
Dec 06 09:40:26 compute-1 ceph-mon[79770]:     use 'ceph osd pool application enable <pool-name> <app-name>', where <app-name> is 'cephfs', 'rbd', 'rgw', or freeform for custom applications.
Dec 06 09:40:26 compute-1 ceph-mon[79770]: 3.e scrub ok
Dec 06 09:40:26 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:40:26 compute-1 ceph-mon[79770]: 2.0 scrub starts
Dec 06 09:40:26 compute-1 ceph-mon[79770]: 2.0 scrub ok
Dec 06 09:40:26 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/3524701111' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "images", "app": "rbd"}]': finished
Dec 06 09:40:26 compute-1 ceph-mon[79770]: osdmap e28: 2 total, 2 up, 2 in
Dec 06 09:40:26 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Dec 06 09:40:26 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Dec 06 09:40:26 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:40:26 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:40:26 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:40:26 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-2.oazbvn", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Dec 06 09:40:26 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.compute-2.oazbvn", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
Dec 06 09:40:26 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "mgr services"}]: dispatch
Dec 06 09:40:26 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 09:40:26 compute-1 ceph-mon[79770]: 3.d scrub starts
Dec 06 09:40:26 compute-1 ceph-mon[79770]: 3.d scrub ok
Dec 06 09:40:26 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.4 scrub starts
Dec 06 09:40:26 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.4 scrub ok
Dec 06 09:40:26 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Dec 06 09:40:26 compute-1 ceph-mon[79770]: mgrc update_daemon_metadata mon.compute-1 metadata {addrs=[v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0],arch=x86_64,ceph_release=squid,ceph_version=ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable),ceph_version_short=19.2.3,compression_algorithms=none, snappy, zlib, zstd, lz4,container_hostname=compute-1,container_image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec,cpu=AMD EPYC-Rome Processor,device_ids=,device_paths=vda=/dev/disk/by-path/pci-0000:00:04.0,devices=vda,distro=centos,distro_description=CentOS Stream 9,distro_version=9,hostname=compute-1,kernel_description=#1 SMP PREEMPT_DYNAMIC Fri Nov 28 14:01:17 UTC 2025,kernel_version=5.14.0-645.el9.x86_64,mem_swap_kb=1048572,mem_total_kb=7864312,os=Linux}
Dec 06 09:40:26 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e29 e29: 2 total, 2 up, 2 in
Dec 06 09:40:26 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "mon metadata", "id": "compute-0"}]: dispatch
Dec 06 09:40:26 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Dec 06 09:40:26 compute-1 ceph-mon[79770]: 3.b deep-scrub starts
Dec 06 09:40:26 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Dec 06 09:40:26 compute-1 ceph-mon[79770]: mon.compute-0 calling monitor election
Dec 06 09:40:26 compute-1 ceph-mon[79770]: 3.b deep-scrub ok
Dec 06 09:40:26 compute-1 ceph-mon[79770]: mon.compute-2 calling monitor election
Dec 06 09:40:26 compute-1 ceph-mon[79770]: 2.3 scrub starts
Dec 06 09:40:26 compute-1 ceph-mon[79770]: 2.3 scrub ok
Dec 06 09:40:26 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/1898003818' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.meta", "app": "cephfs"}]: dispatch
Dec 06 09:40:26 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Dec 06 09:40:26 compute-1 ceph-mon[79770]: 3.7 deep-scrub starts
Dec 06 09:40:26 compute-1 ceph-mon[79770]: 3.7 deep-scrub ok
Dec 06 09:40:26 compute-1 ceph-mon[79770]: 2.b scrub starts
Dec 06 09:40:26 compute-1 ceph-mon[79770]: 2.b scrub ok
Dec 06 09:40:26 compute-1 ceph-mon[79770]: pgmap v91: 131 pgs: 1 peering, 31 unknown, 99 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Dec 06 09:40:26 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Dec 06 09:40:26 compute-1 ceph-mon[79770]: 3.a deep-scrub starts
Dec 06 09:40:26 compute-1 ceph-mon[79770]: 3.a deep-scrub ok
Dec 06 09:40:26 compute-1 ceph-mon[79770]: 2.c scrub starts
Dec 06 09:40:26 compute-1 ceph-mon[79770]: 2.c scrub ok
Dec 06 09:40:26 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Dec 06 09:40:26 compute-1 ceph-mon[79770]: 3.0 scrub starts
Dec 06 09:40:26 compute-1 ceph-mon[79770]: 3.0 scrub ok
Dec 06 09:40:26 compute-1 ceph-mon[79770]: 2.4 scrub starts
Dec 06 09:40:26 compute-1 ceph-mon[79770]: 2.4 scrub ok
Dec 06 09:40:26 compute-1 ceph-mon[79770]: pgmap v92: 131 pgs: 131 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Dec 06 09:40:26 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec 06 09:40:26 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec 06 09:40:26 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec 06 09:40:26 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec 06 09:40:26 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Dec 06 09:40:26 compute-1 ceph-mon[79770]: 3.c scrub starts
Dec 06 09:40:26 compute-1 ceph-mon[79770]: 3.c scrub ok
Dec 06 09:40:26 compute-1 ceph-mon[79770]: 2.1 scrub starts
Dec 06 09:40:26 compute-1 ceph-mon[79770]: 2.1 scrub ok
Dec 06 09:40:26 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Dec 06 09:40:26 compute-1 ceph-mon[79770]: mon.compute-0 is new leader, mons compute-0,compute-2,compute-1 in quorum (ranks 0,1,2)
Dec 06 09:40:26 compute-1 ceph-mon[79770]: monmap epoch 3
Dec 06 09:40:26 compute-1 ceph-mon[79770]: fsid 5ecd3f74-dade-5fc4-92ce-8950ae424258
Dec 06 09:40:26 compute-1 ceph-mon[79770]: last_changed 2025-12-06T09:40:20.714037+0000
Dec 06 09:40:26 compute-1 ceph-mon[79770]: created 2025-12-06T09:37:38.663870+0000
Dec 06 09:40:26 compute-1 ceph-mon[79770]: min_mon_release 19 (squid)
Dec 06 09:40:26 compute-1 ceph-mon[79770]: election_strategy: 1
Dec 06 09:40:26 compute-1 ceph-mon[79770]: 0: [v2:192.168.122.100:3300/0,v1:192.168.122.100:6789/0] mon.compute-0
Dec 06 09:40:26 compute-1 ceph-mon[79770]: 1: [v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0] mon.compute-2
Dec 06 09:40:26 compute-1 ceph-mon[79770]: 2: [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] mon.compute-1
Dec 06 09:40:26 compute-1 ceph-mon[79770]: fsmap 
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[5.11( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=29 pruub=8.702283859s) [1] r=-1 lpr=29 pi=[27,29)/1 crt=0'0 mlcod 0'0 active pruub 59.330039978s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:40:26 compute-1 ceph-mon[79770]: osdmap e28: 2 total, 2 up, 2 in
Dec 06 09:40:26 compute-1 ceph-mon[79770]: mgrmap e9: compute-0.qhdjwa(active, since 2m)
Dec 06 09:40:26 compute-1 ceph-mon[79770]: Health detail: HEALTH_WARN 3 pool(s) do not have an application enabled
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[5.11( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=29 pruub=8.702230453s) [1] r=-1 lpr=29 pi=[27,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 59.330039978s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:40:26 compute-1 ceph-mon[79770]: [WRN] POOL_APP_NOT_ENABLED: 3 pool(s) do not have an application enabled
Dec 06 09:40:26 compute-1 ceph-mon[79770]:     application not enabled on pool 'images'
Dec 06 09:40:26 compute-1 ceph-mon[79770]:     application not enabled on pool 'cephfs.cephfs.meta'
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[5.1f( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=29 pruub=8.698268890s) [1] r=-1 lpr=29 pi=[27,29)/1 crt=0'0 mlcod 0'0 active pruub 59.326400757s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[4.1f( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=29 pruub=13.578024864s) [1] r=-1 lpr=29 pi=[25,29)/1 crt=0'0 mlcod 0'0 active pruub 64.206153870s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[5.1f( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=29 pruub=8.698243141s) [1] r=-1 lpr=29 pi=[27,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 59.326400757s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[4.1f( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=29 pruub=13.577967644s) [1] r=-1 lpr=29 pi=[25,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 64.206153870s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[3.16( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=29 pruub=12.572353363s) [1] r=-1 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 active pruub 63.200588226s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[3.16( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=29 pruub=12.572320938s) [1] r=-1 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 63.200588226s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[3.15( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=29 pruub=12.572243690s) [1] r=-1 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 active pruub 63.200527191s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[3.15( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=29 pruub=12.572222710s) [1] r=-1 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 63.200527191s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[5.10( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=29 pruub=8.701685905s) [1] r=-1 lpr=29 pi=[27,29)/1 crt=0'0 mlcod 0'0 active pruub 59.330024719s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:40:26 compute-1 ceph-mon[79770]:     application not enabled on pool 'cephfs.cephfs.data'
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[3.14( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=29 pruub=12.572280884s) [1] r=-1 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 active pruub 63.200660706s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[5.10( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=29 pruub=8.701644897s) [1] r=-1 lpr=29 pi=[27,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 59.330024719s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[5.15( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=29 pruub=8.701715469s) [1] r=-1 lpr=29 pi=[27,29)/1 crt=0'0 mlcod 0'0 active pruub 59.330139160s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[3.14( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=29 pruub=12.572256088s) [1] r=-1 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 63.200660706s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[5.15( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=29 pruub=8.701698303s) [1] r=-1 lpr=29 pi=[27,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 59.330139160s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:40:26 compute-1 ceph-mon[79770]:     use 'ceph osd pool application enable <pool-name> <app-name>', where <app-name> is 'cephfs', 'rbd', 'rgw', or freeform for custom applications.
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[3.13( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=29 pruub=12.572258949s) [1] r=-1 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 active pruub 63.200714111s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[4.13( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=29 pruub=13.578099251s) [1] r=-1 lpr=29 pi=[25,29)/1 crt=0'0 mlcod 0'0 active pruub 64.206558228s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[3.13( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=29 pruub=12.572238922s) [1] r=-1 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 63.200714111s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[4.13( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=29 pruub=13.578079224s) [1] r=-1 lpr=29 pi=[25,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 64.206558228s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[3.11( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=29 pruub=12.572110176s) [1] r=-1 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 active pruub 63.200721741s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[4.15( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=29 pruub=13.578001022s) [1] r=-1 lpr=29 pi=[25,29)/1 crt=0'0 mlcod 0'0 active pruub 64.206657410s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[3.11( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=29 pruub=12.572064400s) [1] r=-1 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 63.200721741s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[5.16( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=29 pruub=8.701502800s) [1] r=-1 lpr=29 pi=[27,29)/1 crt=0'0 mlcod 0'0 active pruub 59.330177307s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[4.15( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=29 pruub=13.577963829s) [1] r=-1 lpr=29 pi=[25,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 64.206657410s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[3.10( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=29 pruub=12.572211266s) [1] r=-1 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 active pruub 63.200912476s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[5.9( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=29 pruub=8.701452255s) [1] r=-1 lpr=29 pi=[27,29)/1 crt=0'0 mlcod 0'0 active pruub 59.330184937s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[3.10( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=29 pruub=12.572188377s) [1] r=-1 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 63.200912476s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[5.16( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=29 pruub=8.701453209s) [1] r=-1 lpr=29 pi=[27,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 59.330177307s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[5.9( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=29 pruub=8.701428413s) [1] r=-1 lpr=29 pi=[27,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 59.330184937s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[3.f( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=29 pruub=12.572074890s) [1] r=-1 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 active pruub 63.200874329s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[4.8( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=29 pruub=13.577779770s) [1] r=-1 lpr=29 pi=[25,29)/1 crt=0'0 mlcod 0'0 active pruub 64.206588745s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[3.f( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=29 pruub=12.572056770s) [1] r=-1 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 63.200874329s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[4.8( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=29 pruub=13.577763557s) [1] r=-1 lpr=29 pi=[25,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 64.206588745s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[4.9( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=29 pruub=13.577795029s) [1] r=-1 lpr=29 pi=[25,29)/1 crt=0'0 mlcod 0'0 active pruub 64.206672668s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[3.e( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=29 pruub=12.571990013s) [1] r=-1 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 active pruub 63.200889587s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[4.9( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=29 pruub=13.577779770s) [1] r=-1 lpr=29 pi=[25,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 64.206672668s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[3.d( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=29 pruub=12.571936607s) [1] r=-1 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 active pruub 63.200893402s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[3.e( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=29 pruub=12.571940422s) [1] r=-1 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 63.200889587s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[4.a( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=29 pruub=13.577631950s) [1] r=-1 lpr=29 pi=[25,29)/1 crt=0'0 mlcod 0'0 active pruub 64.206611633s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[3.d( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=29 pruub=12.571908951s) [1] r=-1 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 63.200893402s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[4.a( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=29 pruub=13.577610016s) [1] r=-1 lpr=29 pi=[25,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 64.206611633s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[3.c( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=29 pruub=12.571833611s) [1] r=-1 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 active pruub 63.200904846s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[4.c( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=29 pruub=13.577974319s) [1] r=-1 lpr=29 pi=[25,29)/1 crt=0'0 mlcod 0'0 active pruub 64.207054138s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[4.c( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=29 pruub=13.577961922s) [1] r=-1 lpr=29 pi=[25,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 64.207054138s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[3.c( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=29 pruub=12.571818352s) [1] r=-1 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 63.200904846s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[3.a( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=29 pruub=12.571742058s) [1] r=-1 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 active pruub 63.200965881s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[4.d( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=29 pruub=13.577619553s) [1] r=-1 lpr=29 pi=[25,29)/1 crt=0'0 mlcod 0'0 active pruub 64.206855774s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[4.d( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=29 pruub=13.577604294s) [1] r=-1 lpr=29 pi=[25,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 64.206855774s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[3.a( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=29 pruub=12.571721077s) [1] r=-1 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 63.200965881s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[5.1( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=29 pruub=8.701019287s) [1] r=-1 lpr=29 pi=[27,29)/1 crt=0'0 mlcod 0'0 active pruub 59.330410004s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[4.1( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=29 pruub=13.577396393s) [1] r=-1 lpr=29 pi=[25,29)/1 crt=0'0 mlcod 0'0 active pruub 64.206832886s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[5.1( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=29 pruub=8.700989723s) [1] r=-1 lpr=29 pi=[27,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 59.330410004s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[3.5( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=29 pruub=12.578357697s) [1] r=-1 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 active pruub 63.207824707s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[4.1( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=29 pruub=13.577382088s) [1] r=-1 lpr=29 pi=[25,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 64.206832886s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[3.5( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=29 pruub=12.578341484s) [1] r=-1 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 63.207824707s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[5.7( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=29 pruub=8.700942039s) [1] r=-1 lpr=29 pi=[27,29)/1 crt=0'0 mlcod 0'0 active pruub 59.330444336s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[5.7( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=29 pruub=8.700921059s) [1] r=-1 lpr=29 pi=[27,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 59.330444336s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[5.4( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=29 pruub=8.700901985s) [1] r=-1 lpr=29 pi=[27,29)/1 crt=0'0 mlcod 0'0 active pruub 59.330451965s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[4.5( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=29 pruub=13.577339172s) [1] r=-1 lpr=29 pi=[25,29)/1 crt=0'0 mlcod 0'0 active pruub 64.206924438s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[5.4( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=29 pruub=8.700880051s) [1] r=-1 lpr=29 pi=[27,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 59.330451965s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[4.5( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=29 pruub=13.577327728s) [1] r=-1 lpr=29 pi=[25,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 64.206924438s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[3.3( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=29 pruub=12.578140259s) [1] r=-1 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 active pruub 63.207798004s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[3.3( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=29 pruub=12.578126907s) [1] r=-1 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 63.207798004s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[5.2( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=29 pruub=8.700796127s) [1] r=-1 lpr=29 pi=[27,29)/1 crt=0'0 mlcod 0'0 active pruub 59.330554962s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[5.e( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=29 pruub=8.700776100s) [1] r=-1 lpr=29 pi=[27,29)/1 crt=0'0 mlcod 0'0 active pruub 59.330554962s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[5.2( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=29 pruub=8.700767517s) [1] r=-1 lpr=29 pi=[27,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 59.330554962s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[5.f( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=29 pruub=8.700776100s) [1] r=-1 lpr=29 pi=[27,29)/1 crt=0'0 mlcod 0'0 active pruub 59.330581665s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[5.e( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=29 pruub=8.700756073s) [1] r=-1 lpr=29 pi=[27,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 59.330554962s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[5.f( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=29 pruub=8.700746536s) [1] r=-1 lpr=29 pi=[27,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 59.330581665s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[4.e( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=29 pruub=13.577299118s) [1] r=-1 lpr=29 pi=[25,29)/1 crt=0'0 mlcod 0'0 active pruub 64.207168579s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[3.9( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=29 pruub=12.578253746s) [1] r=-1 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 active pruub 63.208145142s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[4.e( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=29 pruub=13.577277184s) [1] r=-1 lpr=29 pi=[25,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 64.207168579s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[3.9( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=29 pruub=12.578232765s) [1] r=-1 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 63.208145142s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[5.1c( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=29 pruub=8.700592995s) [1] r=-1 lpr=29 pi=[27,29)/1 crt=0'0 mlcod 0'0 active pruub 59.330585480s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[5.1a( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=29 pruub=8.700614929s) [1] r=-1 lpr=29 pi=[27,29)/1 crt=0'0 mlcod 0'0 active pruub 59.330642700s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[5.1c( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=29 pruub=8.700572968s) [1] r=-1 lpr=29 pi=[27,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 59.330585480s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[3.1a( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=29 pruub=12.578066826s) [1] r=-1 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 active pruub 63.208156586s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[5.1a( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=29 pruub=8.700556755s) [1] r=-1 lpr=29 pi=[27,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 59.330642700s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[4.1b( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=29 pruub=13.579343796s) [1] r=-1 lpr=29 pi=[25,29)/1 crt=0'0 mlcod 0'0 active pruub 64.209487915s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[3.1a( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=29 pruub=12.578047752s) [1] r=-1 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 63.208156586s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[4.1b( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=29 pruub=13.579326630s) [1] r=-1 lpr=29 pi=[25,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 64.209487915s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[3.1c( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=29 pruub=12.577960014s) [1] r=-1 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 active pruub 63.208160400s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[5.1b( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=29 pruub=8.700434685s) [1] r=-1 lpr=29 pi=[27,29)/1 crt=0'0 mlcod 0'0 active pruub 59.330650330s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[4.1a( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=29 pruub=13.579102516s) [1] r=-1 lpr=29 pi=[25,29)/1 crt=0'0 mlcod 0'0 active pruub 64.209335327s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[5.1b( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=29 pruub=8.700411797s) [1] r=-1 lpr=29 pi=[27,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 59.330650330s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[4.1a( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=29 pruub=13.579081535s) [1] r=-1 lpr=29 pi=[25,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 64.209335327s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[3.1c( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=29 pruub=12.577885628s) [1] r=-1 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 63.208160400s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[3.1d( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=29 pruub=12.578135490s) [1] r=-1 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 active pruub 63.208438873s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[3.1d( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=29 pruub=12.578117371s) [1] r=-1 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 63.208438873s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[5.18( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=29 pruub=8.700207710s) [1] r=-1 lpr=29 pi=[27,29)/1 crt=0'0 mlcod 0'0 active pruub 59.330684662s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[4.18( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=29 pruub=13.579350471s) [1] r=-1 lpr=29 pi=[25,29)/1 crt=0'0 mlcod 0'0 active pruub 64.209854126s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[4.18( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=29 pruub=13.579333305s) [1] r=-1 lpr=29 pi=[25,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 64.209854126s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[5.18( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=29 pruub=8.700164795s) [1] r=-1 lpr=29 pi=[27,29)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 59.330684662s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[2.19( empty local-lis/les=0/0 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=29) [0] r=0 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[2.15( empty local-lis/les=0/0 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=29) [0] r=0 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[2.13( empty local-lis/les=0/0 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=29) [0] r=0 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[2.d( empty local-lis/les=0/0 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=29) [0] r=0 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[2.c( empty local-lis/les=0/0 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=29) [0] r=0 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[2.10( empty local-lis/les=0/0 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=29) [0] r=0 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[2.e( empty local-lis/les=0/0 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=29) [0] r=0 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[2.1( empty local-lis/les=0/0 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=29) [0] r=0 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[2.4( empty local-lis/les=0/0 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=29) [0] r=0 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[2.6( empty local-lis/les=0/0 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=29) [0] r=0 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[2.9( empty local-lis/les=0/0 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=29) [0] r=0 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[2.a( empty local-lis/les=0/0 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=29) [0] r=0 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[2.1f( empty local-lis/les=0/0 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=29) [0] r=0 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[2.1e( empty local-lis/les=0/0 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=29) [0] r=0 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:26 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 29 pg[2.1b( empty local-lis/les=0/0 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=29) [0] r=0 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:40:26 compute-1 sudo[79810]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:40:26 compute-1 sudo[79810]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:40:26 compute-1 sudo[79810]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:26 compute-1 sudo[79835]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 _orch deploy --fsid 5ecd3f74-dade-5fc4-92ce-8950ae424258
Dec 06 09:40:26 compute-1 sudo[79835]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:40:27 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 4.1e scrub starts
Dec 06 09:40:27 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 4.1e scrub ok
Dec 06 09:40:27 compute-1 podman[79899]: 2025-12-06 09:40:27.404950584 +0000 UTC m=+0.043297892 container create aa0c2e021053773e86ba2ee604d884fd82322c9ac8a8b4f77f388cc038511e8f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ecstatic_hermann, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True)
Dec 06 09:40:27 compute-1 systemd[1]: Started libpod-conmon-aa0c2e021053773e86ba2ee604d884fd82322c9ac8a8b4f77f388cc038511e8f.scope.
Dec 06 09:40:27 compute-1 systemd[1]: Started libcrun container.
Dec 06 09:40:27 compute-1 podman[79899]: 2025-12-06 09:40:27.386575567 +0000 UTC m=+0.024922895 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 06 09:40:27 compute-1 podman[79899]: 2025-12-06 09:40:27.489960257 +0000 UTC m=+0.128307635 container init aa0c2e021053773e86ba2ee604d884fd82322c9ac8a8b4f77f388cc038511e8f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ecstatic_hermann, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec 06 09:40:27 compute-1 podman[79899]: 2025-12-06 09:40:27.497676778 +0000 UTC m=+0.136024096 container start aa0c2e021053773e86ba2ee604d884fd82322c9ac8a8b4f77f388cc038511e8f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ecstatic_hermann, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 06 09:40:27 compute-1 podman[79899]: 2025-12-06 09:40:27.502180245 +0000 UTC m=+0.140527603 container attach aa0c2e021053773e86ba2ee604d884fd82322c9ac8a8b4f77f388cc038511e8f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ecstatic_hermann, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, ceph=True)
Dec 06 09:40:27 compute-1 ecstatic_hermann[79916]: 167 167
Dec 06 09:40:27 compute-1 systemd[1]: libpod-aa0c2e021053773e86ba2ee604d884fd82322c9ac8a8b4f77f388cc038511e8f.scope: Deactivated successfully.
Dec 06 09:40:27 compute-1 podman[79899]: 2025-12-06 09:40:27.504448199 +0000 UTC m=+0.142795517 container died aa0c2e021053773e86ba2ee604d884fd82322c9ac8a8b4f77f388cc038511e8f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ecstatic_hermann, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2)
Dec 06 09:40:27 compute-1 systemd[1]: var-lib-containers-storage-overlay-919f56925ff5c04dc21b36e230b03c3c9513857396e97c0edb2448e1fd51ab3d-merged.mount: Deactivated successfully.
Dec 06 09:40:27 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e29 _set_new_cache_sizes cache_size:1019933393 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 09:40:27 compute-1 podman[79899]: 2025-12-06 09:40:27.848581931 +0000 UTC m=+0.486929269 container remove aa0c2e021053773e86ba2ee604d884fd82322c9ac8a8b4f77f388cc038511e8f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ecstatic_hermann, io.buildah.version=1.40.1, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 09:40:27 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e30 e30: 2 total, 2 up, 2 in
Dec 06 09:40:27 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 30 pg[2.1b( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=29) [0] r=0 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:27 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 30 pg[2.9( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=29) [0] r=0 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:27 compute-1 ceph-mon[79770]: mon.compute-1 calling monitor election
Dec 06 09:40:27 compute-1 ceph-mon[79770]: Health check update: 2 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Dec 06 09:40:27 compute-1 ceph-mon[79770]: 3.4 scrub starts
Dec 06 09:40:27 compute-1 ceph-mon[79770]: 3.4 scrub ok
Dec 06 09:40:27 compute-1 ceph-mon[79770]: 2.e deep-scrub starts
Dec 06 09:40:27 compute-1 ceph-mon[79770]: 2.e deep-scrub ok
Dec 06 09:40:27 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 30 pg[2.1e( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=29) [0] r=0 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:27 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 30 pg[2.6( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=29) [0] r=0 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:27 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 30 pg[2.1( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=29) [0] r=0 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:27 compute-1 ceph-mon[79770]: pgmap v93: 131 pgs: 131 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Dec 06 09:40:27 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec 06 09:40:27 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec 06 09:40:27 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec 06 09:40:27 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec 06 09:40:27 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:40:27 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/1898003818' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.meta", "app": "cephfs"}]': finished
Dec 06 09:40:27 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 30 pg[2.4( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=29) [0] r=0 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:27 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"}]': finished
Dec 06 09:40:27 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 30 pg[2.1f( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=29) [0] r=0 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:27 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 30 pg[2.d( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=29) [0] r=0 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:27 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"}]': finished
Dec 06 09:40:27 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 30 pg[2.c( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=29) [0] r=0 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:27 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 30 pg[2.a( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=29) [0] r=0 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:27 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"}]': finished
Dec 06 09:40:27 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 30 pg[2.13( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=29) [0] r=0 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:27 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 30 pg[2.10( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=29) [0] r=0 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:27 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"}]': finished
Dec 06 09:40:27 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 30 pg[2.15( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=29) [0] r=0 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:27 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 30 pg[2.19( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=29) [0] r=0 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:27 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 30 pg[2.e( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=29) [0] r=0 lpr=29 pi=[24,29)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:40:27 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:40:27 compute-1 ceph-mon[79770]: osdmap e29: 2 total, 2 up, 2 in
Dec 06 09:40:27 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:40:27 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:40:27 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-1.sauzid", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Dec 06 09:40:27 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Dec 06 09:40:27 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.compute-1.sauzid", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
Dec 06 09:40:27 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "mgr services"}]: dispatch
Dec 06 09:40:27 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 09:40:27 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/21529314' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.data", "app": "cephfs"}]: dispatch
Dec 06 09:40:27 compute-1 systemd[1]: libpod-conmon-aa0c2e021053773e86ba2ee604d884fd82322c9ac8a8b4f77f388cc038511e8f.scope: Deactivated successfully.
Dec 06 09:40:28 compute-1 systemd[1]: Reloading.
Dec 06 09:40:28 compute-1 systemd-rc-local-generator[79960]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:40:28 compute-1 systemd-sysv-generator[79964]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:40:28 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 5.1e scrub starts
Dec 06 09:40:28 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 5.1e scrub ok
Dec 06 09:40:28 compute-1 systemd[1]: Reloading.
Dec 06 09:40:28 compute-1 systemd-rc-local-generator[80003]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:40:28 compute-1 systemd-sysv-generator[80007]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:40:28 compute-1 systemd[1]: Starting Ceph mgr.compute-1.sauzid for 5ecd3f74-dade-5fc4-92ce-8950ae424258...
Dec 06 09:40:28 compute-1 ceph-mon[79770]: Deploying daemon mgr.compute-1.sauzid on compute-1
Dec 06 09:40:28 compute-1 ceph-mon[79770]: 4.1e scrub starts
Dec 06 09:40:28 compute-1 ceph-mon[79770]: 4.1e scrub ok
Dec 06 09:40:28 compute-1 ceph-mon[79770]: 2.1a scrub starts
Dec 06 09:40:28 compute-1 ceph-mon[79770]: 2.1a scrub ok
Dec 06 09:40:28 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"}]': finished
Dec 06 09:40:28 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"}]': finished
Dec 06 09:40:28 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"}]': finished
Dec 06 09:40:28 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"}]': finished
Dec 06 09:40:28 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/21529314' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.data", "app": "cephfs"}]': finished
Dec 06 09:40:28 compute-1 ceph-mon[79770]: osdmap e30: 2 total, 2 up, 2 in
Dec 06 09:40:28 compute-1 ceph-mon[79770]: 2.18 scrub starts
Dec 06 09:40:28 compute-1 ceph-mon[79770]: 2.18 scrub ok
Dec 06 09:40:28 compute-1 ceph-mon[79770]: pgmap v96: 131 pgs: 47 peering, 84 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Dec 06 09:40:28 compute-1 podman[80060]: 2025-12-06 09:40:28.922782339 +0000 UTC m=+0.062874754 container create 66d946b34f9046b885a4188f19fa23f79edaa2e9c5ef5e17f29d5748ef54b8c9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, ceph=True, CEPH_REF=squid, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 06 09:40:28 compute-1 podman[80060]: 2025-12-06 09:40:28.889872609 +0000 UTC m=+0.029965114 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 06 09:40:28 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/11a98e9a6e8fe2a8669b6179c00f600d296c8ae58ca091f39e6db3a2e4116e5b/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 06 09:40:28 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/11a98e9a6e8fe2a8669b6179c00f600d296c8ae58ca091f39e6db3a2e4116e5b/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 06 09:40:28 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/11a98e9a6e8fe2a8669b6179c00f600d296c8ae58ca091f39e6db3a2e4116e5b/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 06 09:40:28 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/11a98e9a6e8fe2a8669b6179c00f600d296c8ae58ca091f39e6db3a2e4116e5b/merged/var/lib/ceph/mgr/ceph-compute-1.sauzid supports timestamps until 2038 (0x7fffffff)
Dec 06 09:40:29 compute-1 podman[80060]: 2025-12-06 09:40:29.007606718 +0000 UTC m=+0.147699153 container init 66d946b34f9046b885a4188f19fa23f79edaa2e9c5ef5e17f29d5748ef54b8c9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec 06 09:40:29 compute-1 podman[80060]: 2025-12-06 09:40:29.017081472 +0000 UTC m=+0.157173887 container start 66d946b34f9046b885a4188f19fa23f79edaa2e9c5ef5e17f29d5748ef54b8c9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 06 09:40:29 compute-1 bash[80060]: 66d946b34f9046b885a4188f19fa23f79edaa2e9c5ef5e17f29d5748ef54b8c9
Dec 06 09:40:29 compute-1 systemd[1]: Started Ceph mgr.compute-1.sauzid for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec 06 09:40:29 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 4.10 scrub starts
Dec 06 09:40:29 compute-1 sudo[79835]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:29 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 4.10 scrub ok
Dec 06 09:40:29 compute-1 ceph-mgr[80080]: set uid:gid to 167:167 (ceph:ceph)
Dec 06 09:40:29 compute-1 ceph-mgr[80080]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Dec 06 09:40:29 compute-1 ceph-mgr[80080]: pidfile_write: ignore empty --pid-file
Dec 06 09:40:29 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'alerts'
Dec 06 09:40:29 compute-1 ceph-mgr[80080]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec 06 09:40:29 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'balancer'
Dec 06 09:40:29 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:40:29.230+0000 7fa301e39140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec 06 09:40:29 compute-1 ceph-mgr[80080]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec 06 09:40:29 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'cephadm'
Dec 06 09:40:29 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:40:29.314+0000 7fa301e39140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec 06 09:40:30 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'crash'
Dec 06 09:40:30 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 4.11 scrub starts
Dec 06 09:40:30 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 4.11 scrub ok
Dec 06 09:40:30 compute-1 ceph-mgr[80080]: mgr[py] Module crash has missing NOTIFY_TYPES member
Dec 06 09:40:30 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'dashboard'
Dec 06 09:40:30 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:40:30.142+0000 7fa301e39140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Dec 06 09:40:30 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'devicehealth'
Dec 06 09:40:31 compute-1 ceph-mgr[80080]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec 06 09:40:31 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'diskprediction_local'
Dec 06 09:40:31 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:40:31.008+0000 7fa301e39140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec 06 09:40:31 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 5.13 deep-scrub starts
Dec 06 09:40:31 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 5.13 deep-scrub ok
Dec 06 09:40:31 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Dec 06 09:40:31 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Dec 06 09:40:31 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]:   from numpy import show_config as show_numpy_config
Dec 06 09:40:31 compute-1 ceph-mgr[80080]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec 06 09:40:31 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'influx'
Dec 06 09:40:31 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:40:31.510+0000 7fa301e39140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec 06 09:40:31 compute-1 ceph-mgr[80080]: mgr[py] Module influx has missing NOTIFY_TYPES member
Dec 06 09:40:31 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'insights'
Dec 06 09:40:31 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:40:31.589+0000 7fa301e39140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Dec 06 09:40:31 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'iostat'
Dec 06 09:40:31 compute-1 ceph-mgr[80080]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec 06 09:40:31 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'k8sevents'
Dec 06 09:40:31 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:40:31.765+0000 7fa301e39140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec 06 09:40:32 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 4.12 scrub starts
Dec 06 09:40:32 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 4.12 scrub ok
Dec 06 09:40:32 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'localpool'
Dec 06 09:40:32 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'mds_autoscaler'
Dec 06 09:40:32 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'mirroring'
Dec 06 09:40:32 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'nfs'
Dec 06 09:40:32 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e30 _set_new_cache_sizes cache_size:1020053179 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 09:40:33 compute-1 ceph-mgr[80080]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec 06 09:40:33 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'orchestrator'
Dec 06 09:40:33 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:40:33.159+0000 7fa301e39140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec 06 09:40:33 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 5.12 deep-scrub starts
Dec 06 09:40:33 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 5.12 deep-scrub ok
Dec 06 09:40:33 compute-1 ceph-mgr[80080]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec 06 09:40:33 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'osd_perf_query'
Dec 06 09:40:33 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:40:33.420+0000 7fa301e39140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec 06 09:40:33 compute-1 ceph-mgr[80080]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec 06 09:40:33 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'osd_support'
Dec 06 09:40:33 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:40:33.599+0000 7fa301e39140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec 06 09:40:33 compute-1 ceph-mgr[80080]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec 06 09:40:33 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'pg_autoscaler'
Dec 06 09:40:33 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:40:33.691+0000 7fa301e39140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec 06 09:40:33 compute-1 ceph-mgr[80080]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec 06 09:40:33 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'progress'
Dec 06 09:40:33 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:40:33.803+0000 7fa301e39140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec 06 09:40:33 compute-1 ceph-mgr[80080]: mgr[py] Module progress has missing NOTIFY_TYPES member
Dec 06 09:40:33 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'prometheus'
Dec 06 09:40:33 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:40:33.893+0000 7fa301e39140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Dec 06 09:40:34 compute-1 ceph-mon[79770]: 5.1e scrub starts
Dec 06 09:40:34 compute-1 ceph-mon[79770]: 5.1e scrub ok
Dec 06 09:40:34 compute-1 ceph-mon[79770]: Health check cleared: POOL_APP_NOT_ENABLED (was: 2 pool(s) do not have an application enabled)
Dec 06 09:40:34 compute-1 ceph-mon[79770]: Cluster is now healthy
Dec 06 09:40:34 compute-1 ceph-mon[79770]: 2.17 scrub starts
Dec 06 09:40:34 compute-1 ceph-mon[79770]: 2.17 scrub ok
Dec 06 09:40:34 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:40:34 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 4.14 scrub starts
Dec 06 09:40:34 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 4.14 scrub ok
Dec 06 09:40:34 compute-1 ceph-mgr[80080]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec 06 09:40:34 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'rbd_support'
Dec 06 09:40:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:40:34.329+0000 7fa301e39140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec 06 09:40:34 compute-1 ceph-mgr[80080]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec 06 09:40:34 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'restful'
Dec 06 09:40:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:40:34.546+0000 7fa301e39140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec 06 09:40:34 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'rgw'
Dec 06 09:40:35 compute-1 ceph-mgr[80080]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec 06 09:40:35 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'rook'
Dec 06 09:40:35 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:40:35.093+0000 7fa301e39140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec 06 09:40:35 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 5.14 scrub starts
Dec 06 09:40:35 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 5.14 scrub ok
Dec 06 09:40:36 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 5.17 scrub starts
Dec 06 09:40:36 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 5.17 scrub ok
Dec 06 09:40:36 compute-1 ceph-mgr[80080]: mgr[py] Module rook has missing NOTIFY_TYPES member
Dec 06 09:40:36 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'selftest'
Dec 06 09:40:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:40:36.294+0000 7fa301e39140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Dec 06 09:40:36 compute-1 ceph-mgr[80080]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec 06 09:40:36 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'snap_schedule'
Dec 06 09:40:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:40:36.386+0000 7fa301e39140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec 06 09:40:36 compute-1 ceph-mgr[80080]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Dec 06 09:40:36 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'stats'
Dec 06 09:40:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:40:36.489+0000 7fa301e39140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Dec 06 09:40:36 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'status'
Dec 06 09:40:36 compute-1 ceph-mgr[80080]: mgr[py] Module status has missing NOTIFY_TYPES member
Dec 06 09:40:36 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'telegraf'
Dec 06 09:40:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:40:36.882+0000 7fa301e39140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Dec 06 09:40:36 compute-1 ceph-mgr[80080]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec 06 09:40:36 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'telemetry'
Dec 06 09:40:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:40:36.972+0000 7fa301e39140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec 06 09:40:37 compute-1 ceph-mgr[80080]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec 06 09:40:37 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'test_orchestrator'
Dec 06 09:40:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:40:37.181+0000 7fa301e39140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec 06 09:40:37 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 4.16 scrub starts
Dec 06 09:40:37 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 4.16 scrub ok
Dec 06 09:40:37 compute-1 ceph-mgr[80080]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec 06 09:40:37 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'volumes'
Dec 06 09:40:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:40:37.457+0000 7fa301e39140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec 06 09:40:37 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e30 _set_new_cache_sizes cache_size:1020054712 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 09:40:38 compute-1 ceph-mgr[80080]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec 06 09:40:38 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'zabbix'
Dec 06 09:40:38 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:40:38.079+0000 7fa301e39140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec 06 09:40:38 compute-1 ceph-mgr[80080]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Dec 06 09:40:38 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:40:38.175+0000 7fa301e39140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Dec 06 09:40:38 compute-1 ceph-mgr[80080]: ms_deliver_dispatch: unhandled message 0x56071dbacd00 mon_map magic: 0 from mon.2 v2:192.168.122.101:3300/0
Dec 06 09:40:38 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 4.17 deep-scrub starts
Dec 06 09:40:38 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 4.17 deep-scrub ok
Dec 06 09:40:39 compute-1 ceph-mon[79770]: 4.10 scrub starts
Dec 06 09:40:39 compute-1 ceph-mon[79770]: 4.10 scrub ok
Dec 06 09:40:39 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:40:39 compute-1 ceph-mon[79770]: 4.11 scrub starts
Dec 06 09:40:39 compute-1 ceph-mon[79770]: 4.11 scrub ok
Dec 06 09:40:39 compute-1 ceph-mon[79770]: 2.16 scrub starts
Dec 06 09:40:39 compute-1 ceph-mon[79770]: 2.16 scrub ok
Dec 06 09:40:39 compute-1 ceph-mon[79770]: pgmap v97: 131 pgs: 47 peering, 84 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Dec 06 09:40:39 compute-1 ceph-mon[79770]: 5.13 deep-scrub starts
Dec 06 09:40:39 compute-1 ceph-mon[79770]: 5.13 deep-scrub ok
Dec 06 09:40:39 compute-1 ceph-mon[79770]: 2.14 scrub starts
Dec 06 09:40:39 compute-1 ceph-mon[79770]: 2.14 scrub ok
Dec 06 09:40:39 compute-1 ceph-mon[79770]: 4.12 scrub starts
Dec 06 09:40:39 compute-1 ceph-mon[79770]: 4.12 scrub ok
Dec 06 09:40:39 compute-1 ceph-mon[79770]: 2.12 deep-scrub starts
Dec 06 09:40:39 compute-1 ceph-mon[79770]: pgmap v98: 131 pgs: 131 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Dec 06 09:40:39 compute-1 ceph-mon[79770]: 2.12 deep-scrub ok
Dec 06 09:40:39 compute-1 ceph-mon[79770]: 5.12 deep-scrub starts
Dec 06 09:40:39 compute-1 ceph-mon[79770]: 5.12 deep-scrub ok
Dec 06 09:40:39 compute-1 ceph-mon[79770]: 2.11 scrub starts
Dec 06 09:40:39 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/2318794964' entity='client.admin' cmd=[{"prefix": "config assimilate-conf"}]: dispatch
Dec 06 09:40:39 compute-1 ceph-mon[79770]: 2.11 scrub ok
Dec 06 09:40:39 compute-1 ceph-mon[79770]: Standby manager daemon compute-2.oazbvn started
Dec 06 09:40:39 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:40:39 compute-1 ceph-mon[79770]: mgrmap e10: compute-0.qhdjwa(active, since 2m), standbys: compute-2.oazbvn
Dec 06 09:40:39 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "mgr metadata", "who": "compute-2.oazbvn", "id": "compute-2.oazbvn"}]: dispatch
Dec 06 09:40:39 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:40:39 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/2318794964' entity='client.admin' cmd='[{"prefix": "config assimilate-conf"}]': finished
Dec 06 09:40:39 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:40:39 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-2", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Dec 06 09:40:39 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "auth get-or-create", "entity": "client.crash.compute-2", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]': finished
Dec 06 09:40:39 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 09:40:39 compute-1 ceph-mon[79770]: Deploying daemon crash.compute-2 on compute-2
Dec 06 09:40:39 compute-1 ceph-mon[79770]: 2.f scrub starts
Dec 06 09:40:39 compute-1 ceph-mon[79770]: 2.f scrub ok
Dec 06 09:40:39 compute-1 ceph-mon[79770]: pgmap v99: 131 pgs: 131 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Dec 06 09:40:39 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 5.8 scrub starts
Dec 06 09:40:39 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 5.8 scrub ok
Dec 06 09:40:40 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 5.a scrub starts
Dec 06 09:40:40 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 5.a scrub ok
Dec 06 09:40:41 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 4.b scrub starts
Dec 06 09:40:41 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 4.b scrub ok
Dec 06 09:40:41 compute-1 ceph-mon[79770]: 4.14 scrub starts
Dec 06 09:40:41 compute-1 ceph-mon[79770]: 4.14 scrub ok
Dec 06 09:40:41 compute-1 ceph-mon[79770]: 5.14 scrub starts
Dec 06 09:40:41 compute-1 ceph-mon[79770]: 5.14 scrub ok
Dec 06 09:40:41 compute-1 ceph-mon[79770]: 4.1f scrub starts
Dec 06 09:40:41 compute-1 ceph-mon[79770]: 4.1f scrub ok
Dec 06 09:40:41 compute-1 ceph-mon[79770]: 5.17 scrub starts
Dec 06 09:40:41 compute-1 ceph-mon[79770]: 5.17 scrub ok
Dec 06 09:40:41 compute-1 ceph-mon[79770]: 5.1f scrub starts
Dec 06 09:40:41 compute-1 ceph-mon[79770]: pgmap v100: 131 pgs: 131 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Dec 06 09:40:41 compute-1 ceph-mon[79770]: 4.16 scrub starts
Dec 06 09:40:41 compute-1 ceph-mon[79770]: 4.16 scrub ok
Dec 06 09:40:41 compute-1 ceph-mon[79770]: 5.1f scrub ok
Dec 06 09:40:41 compute-1 ceph-mon[79770]: 5.10 deep-scrub starts
Dec 06 09:40:41 compute-1 ceph-mon[79770]: 5.10 deep-scrub ok
Dec 06 09:40:41 compute-1 ceph-mon[79770]: 4.13 scrub starts
Dec 06 09:40:41 compute-1 ceph-mon[79770]: 4.17 deep-scrub starts
Dec 06 09:40:41 compute-1 ceph-mon[79770]: 4.17 deep-scrub ok
Dec 06 09:40:41 compute-1 ceph-mon[79770]: pgmap v101: 131 pgs: 1 active+clean+scrubbing, 1 active+clean+scrubbing+deep, 129 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Dec 06 09:40:41 compute-1 ceph-mon[79770]: 5.8 scrub starts
Dec 06 09:40:41 compute-1 ceph-mon[79770]: 5.8 scrub ok
Dec 06 09:40:41 compute-1 ceph-mon[79770]: Standby manager daemon compute-1.sauzid started
Dec 06 09:40:42 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 5.d scrub starts
Dec 06 09:40:42 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 5.d scrub ok
Dec 06 09:40:42 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e30 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 09:40:43 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 5.c scrub starts
Dec 06 09:40:43 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 5.c scrub ok
Dec 06 09:40:44 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 5.6 deep-scrub starts
Dec 06 09:40:44 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 5.6 deep-scrub ok
Dec 06 09:40:45 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 4.7 scrub starts
Dec 06 09:40:45 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 4.7 scrub ok
Dec 06 09:40:45 compute-1 ceph-mon[79770]: 4.13 scrub ok
Dec 06 09:40:45 compute-1 ceph-mon[79770]: 5.15 scrub starts
Dec 06 09:40:45 compute-1 ceph-mon[79770]: 5.15 scrub ok
Dec 06 09:40:45 compute-1 ceph-mon[79770]: 5.a scrub starts
Dec 06 09:40:45 compute-1 ceph-mon[79770]: 5.a scrub ok
Dec 06 09:40:45 compute-1 ceph-mon[79770]: pgmap v102: 131 pgs: 1 active+clean+scrubbing, 1 active+clean+scrubbing+deep, 129 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Dec 06 09:40:45 compute-1 ceph-mon[79770]: 5.11 scrub starts
Dec 06 09:40:45 compute-1 ceph-mon[79770]: 5.11 scrub ok
Dec 06 09:40:45 compute-1 ceph-mon[79770]: 4.b scrub starts
Dec 06 09:40:45 compute-1 ceph-mon[79770]: 4.b scrub ok
Dec 06 09:40:45 compute-1 ceph-mon[79770]: 5.16 scrub starts
Dec 06 09:40:45 compute-1 ceph-mon[79770]: 5.16 scrub ok
Dec 06 09:40:45 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/1940510154' entity='client.admin' 
Dec 06 09:40:45 compute-1 ceph-mon[79770]: mgrmap e11: compute-0.qhdjwa(active, since 2m), standbys: compute-1.sauzid, compute-2.oazbvn
Dec 06 09:40:45 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "mgr metadata", "who": "compute-1.sauzid", "id": "compute-1.sauzid"}]: dispatch
Dec 06 09:40:45 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:40:45 compute-1 ceph-mon[79770]: 5.d scrub starts
Dec 06 09:40:45 compute-1 ceph-mon[79770]: 5.d scrub ok
Dec 06 09:40:45 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:40:45 compute-1 ceph-mon[79770]: pgmap v103: 131 pgs: 131 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Dec 06 09:40:45 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:40:45 compute-1 ceph-mon[79770]: 4.9 scrub starts
Dec 06 09:40:45 compute-1 ceph-mon[79770]: 4.9 scrub ok
Dec 06 09:40:45 compute-1 ceph-mon[79770]: 5.c scrub starts
Dec 06 09:40:45 compute-1 ceph-mon[79770]: 5.c scrub ok
Dec 06 09:40:46 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 5.b scrub starts
Dec 06 09:40:46 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 5.b scrub ok
Dec 06 09:40:47 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 4.0 scrub starts
Dec 06 09:40:47 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 4.0 scrub ok
Dec 06 09:40:47 compute-1 ceph-mon[79770]: 4.15 scrub starts
Dec 06 09:40:47 compute-1 ceph-mon[79770]: 5.6 deep-scrub starts
Dec 06 09:40:47 compute-1 ceph-mon[79770]: 5.6 deep-scrub ok
Dec 06 09:40:47 compute-1 ceph-mon[79770]: 4.15 scrub ok
Dec 06 09:40:47 compute-1 ceph-mon[79770]: pgmap v104: 131 pgs: 131 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Dec 06 09:40:47 compute-1 ceph-mon[79770]: 5.9 scrub starts
Dec 06 09:40:47 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:40:47 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 06 09:40:47 compute-1 ceph-mon[79770]: 5.9 scrub ok
Dec 06 09:40:47 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 06 09:40:47 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 09:40:47 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 06 09:40:47 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 09:40:47 compute-1 ceph-mon[79770]: 4.7 scrub starts
Dec 06 09:40:47 compute-1 ceph-mon[79770]: 4.7 scrub ok
Dec 06 09:40:47 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:40:47 compute-1 ceph-mon[79770]: from='client.14241 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 09:40:47 compute-1 ceph-mon[79770]: Saving service rgw.rgw spec with placement compute-0;compute-1;compute-2
Dec 06 09:40:47 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:40:47 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:40:47 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e30 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 09:40:48 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 5.0 scrub starts
Dec 06 09:40:48 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 5.0 scrub ok
Dec 06 09:40:48 compute-1 systemd[72560]: Starting Mark boot as successful...
Dec 06 09:40:48 compute-1 systemd[72560]: Finished Mark boot as successful.
Dec 06 09:40:48 compute-1 ceph-mon[79770]: 4.8 scrub starts
Dec 06 09:40:48 compute-1 ceph-mon[79770]: Saving service ingress.rgw.default spec with placement count:2
Dec 06 09:40:48 compute-1 ceph-mon[79770]: 4.8 scrub ok
Dec 06 09:40:48 compute-1 ceph-mon[79770]: 5.b scrub starts
Dec 06 09:40:48 compute-1 ceph-mon[79770]: 5.b scrub ok
Dec 06 09:40:48 compute-1 ceph-mon[79770]: pgmap v105: 131 pgs: 131 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Dec 06 09:40:48 compute-1 ceph-mon[79770]: 4.a scrub starts
Dec 06 09:40:48 compute-1 ceph-mon[79770]: 4.a scrub ok
Dec 06 09:40:48 compute-1 ceph-mon[79770]: 4.0 scrub starts
Dec 06 09:40:48 compute-1 ceph-mon[79770]: 4.0 scrub ok
Dec 06 09:40:48 compute-1 ceph-mon[79770]: 4.d scrub starts
Dec 06 09:40:48 compute-1 ceph-mon[79770]: 4.d scrub ok
Dec 06 09:40:48 compute-1 ceph-mon[79770]: pgmap v106: 131 pgs: 131 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Dec 06 09:40:49 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e31 e31: 3 total, 2 up, 3 in
Dec 06 09:40:49 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.6 scrub starts
Dec 06 09:40:49 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.6 scrub ok
Dec 06 09:40:49 compute-1 ceph-mon[79770]: 5.0 scrub starts
Dec 06 09:40:49 compute-1 ceph-mon[79770]: 5.0 scrub ok
Dec 06 09:40:49 compute-1 ceph-mon[79770]: 4.5 scrub starts
Dec 06 09:40:49 compute-1 ceph-mon[79770]: 4.5 scrub ok
Dec 06 09:40:49 compute-1 ceph-mon[79770]: from='client.14247 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 09:40:49 compute-1 ceph-mon[79770]: Saving service node-exporter spec with placement *
Dec 06 09:40:49 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:40:49 compute-1 ceph-mon[79770]: Saving service grafana spec with placement compute-0;count:1
Dec 06 09:40:49 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:40:49 compute-1 ceph-mon[79770]: Saving service prometheus spec with placement compute-0;count:1
Dec 06 09:40:49 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:40:49 compute-1 ceph-mon[79770]: Saving service alertmanager spec with placement compute-0;count:1
Dec 06 09:40:49 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:40:49 compute-1 ceph-mon[79770]: from='client.? ' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "b46cc65b-25ba-490a-8b8e-91e4407f3aed"}]: dispatch
Dec 06 09:40:49 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/569971095' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "b46cc65b-25ba-490a-8b8e-91e4407f3aed"}]: dispatch
Dec 06 09:40:49 compute-1 ceph-mon[79770]: from='client.? ' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "b46cc65b-25ba-490a-8b8e-91e4407f3aed"}]': finished
Dec 06 09:40:49 compute-1 ceph-mon[79770]: osdmap e31: 3 total, 2 up, 3 in
Dec 06 09:40:49 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec 06 09:40:49 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/3771187413' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch
Dec 06 09:40:50 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 5.3 deep-scrub starts
Dec 06 09:40:50 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 5.3 deep-scrub ok
Dec 06 09:40:51 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 4.2 scrub starts
Dec 06 09:40:51 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 4.2 scrub ok
Dec 06 09:40:51 compute-1 ceph-mon[79770]: 3.6 scrub starts
Dec 06 09:40:51 compute-1 ceph-mon[79770]: 3.6 scrub ok
Dec 06 09:40:51 compute-1 ceph-mon[79770]: 5.4 scrub starts
Dec 06 09:40:51 compute-1 ceph-mon[79770]: 5.4 scrub ok
Dec 06 09:40:51 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/4267326554' entity='client.admin' 
Dec 06 09:40:51 compute-1 ceph-mon[79770]: pgmap v108: 131 pgs: 131 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Dec 06 09:40:52 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 4.6 deep-scrub starts
Dec 06 09:40:52 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 4.6 deep-scrub ok
Dec 06 09:40:52 compute-1 ceph-mon[79770]: 5.3 deep-scrub starts
Dec 06 09:40:52 compute-1 ceph-mon[79770]: 5.3 deep-scrub ok
Dec 06 09:40:52 compute-1 ceph-mon[79770]: 5.2 scrub starts
Dec 06 09:40:52 compute-1 ceph-mon[79770]: 5.2 scrub ok
Dec 06 09:40:52 compute-1 ceph-mon[79770]: 4.2 scrub starts
Dec 06 09:40:52 compute-1 ceph-mon[79770]: 4.2 scrub ok
Dec 06 09:40:52 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:40:52 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:40:52 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/821839877' entity='client.admin' 
Dec 06 09:40:52 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e31 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 09:40:53 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.2 scrub starts
Dec 06 09:40:53 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.2 scrub ok
Dec 06 09:40:53 compute-1 ceph-mon[79770]: 5.7 deep-scrub starts
Dec 06 09:40:53 compute-1 ceph-mon[79770]: 5.7 deep-scrub ok
Dec 06 09:40:53 compute-1 ceph-mon[79770]: 4.6 deep-scrub starts
Dec 06 09:40:53 compute-1 ceph-mon[79770]: 4.6 deep-scrub ok
Dec 06 09:40:53 compute-1 ceph-mon[79770]: pgmap v109: 131 pgs: 131 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Dec 06 09:40:53 compute-1 ceph-mon[79770]: 5.1 scrub starts
Dec 06 09:40:53 compute-1 ceph-mon[79770]: 5.1 scrub ok
Dec 06 09:40:54 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 5.5 scrub starts
Dec 06 09:40:54 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 5.5 scrub ok
Dec 06 09:40:55 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 4.4 scrub starts
Dec 06 09:40:55 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 4.4 scrub ok
Dec 06 09:40:55 compute-1 ceph-mon[79770]: 3.2 scrub starts
Dec 06 09:40:55 compute-1 ceph-mon[79770]: 3.2 scrub ok
Dec 06 09:40:55 compute-1 ceph-mon[79770]: 4.e scrub starts
Dec 06 09:40:55 compute-1 ceph-mon[79770]: 4.e scrub ok
Dec 06 09:40:55 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/1482144347' entity='client.admin' 
Dec 06 09:40:55 compute-1 ceph-mon[79770]: pgmap v110: 131 pgs: 131 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Dec 06 09:40:55 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch
Dec 06 09:40:55 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 09:40:56 compute-1 sudo[80136]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qpsoblzvokeispfxcspgielnmwixedhw ; /usr/bin/python3'
Dec 06 09:40:56 compute-1 sudo[80136]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:40:56 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 4.3 scrub starts
Dec 06 09:40:56 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 4.3 scrub ok
Dec 06 09:40:56 compute-1 python3[80138]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a -f 'name=ceph-?(.*)-mgr.*' --format \{\{\.Command\}\} --no-trunc _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:40:56 compute-1 ceph-mon[79770]: 5.5 scrub starts
Dec 06 09:40:56 compute-1 ceph-mon[79770]: 5.5 scrub ok
Dec 06 09:40:56 compute-1 ceph-mon[79770]: Deploying daemon osd.2 on compute-2
Dec 06 09:40:56 compute-1 ceph-mon[79770]: 3.9 scrub starts
Dec 06 09:40:56 compute-1 ceph-mon[79770]: 3.9 scrub ok
Dec 06 09:40:56 compute-1 ceph-mon[79770]: 4.4 scrub starts
Dec 06 09:40:56 compute-1 ceph-mon[79770]: 4.4 scrub ok
Dec 06 09:40:56 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/3512142115' entity='client.admin' 
Dec 06 09:40:56 compute-1 sudo[80136]: pam_unix(sudo:session): session closed for user root
Dec 06 09:40:57 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.1 scrub starts
Dec 06 09:40:57 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.1 scrub ok
Dec 06 09:40:57 compute-1 ceph-mon[79770]: 4.1 scrub starts
Dec 06 09:40:57 compute-1 ceph-mon[79770]: 4.1 scrub ok
Dec 06 09:40:57 compute-1 ceph-mon[79770]: 4.3 scrub starts
Dec 06 09:40:57 compute-1 ceph-mon[79770]: 4.3 scrub ok
Dec 06 09:40:57 compute-1 ceph-mon[79770]: pgmap v111: 131 pgs: 131 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Dec 06 09:40:57 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e31 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 09:40:58 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.8 scrub starts
Dec 06 09:40:58 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.8 scrub ok
Dec 06 09:40:58 compute-1 ceph-mon[79770]: 3.5 scrub starts
Dec 06 09:40:58 compute-1 ceph-mon[79770]: 3.5 scrub ok
Dec 06 09:40:58 compute-1 ceph-mon[79770]: 3.1 scrub starts
Dec 06 09:40:58 compute-1 ceph-mon[79770]: 3.1 scrub ok
Dec 06 09:40:58 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/2451230512' entity='client.admin' 
Dec 06 09:40:59 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 4.1d scrub starts
Dec 06 09:40:59 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 4.1d scrub ok
Dec 06 09:40:59 compute-1 ceph-mon[79770]: 4.c scrub starts
Dec 06 09:40:59 compute-1 ceph-mon[79770]: 4.c scrub ok
Dec 06 09:40:59 compute-1 ceph-mon[79770]: 3.8 scrub starts
Dec 06 09:40:59 compute-1 ceph-mon[79770]: 3.8 scrub ok
Dec 06 09:40:59 compute-1 ceph-mon[79770]: pgmap v112: 131 pgs: 131 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Dec 06 09:40:59 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:40:59 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/2111286861' entity='client.admin' 
Dec 06 09:40:59 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:41:00 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 4.1c scrub starts
Dec 06 09:41:00 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 4.1c scrub ok
Dec 06 09:41:00 compute-1 ceph-mon[79770]: 5.e scrub starts
Dec 06 09:41:00 compute-1 ceph-mon[79770]: 5.e scrub ok
Dec 06 09:41:00 compute-1 ceph-mon[79770]: 4.1d scrub starts
Dec 06 09:41:00 compute-1 ceph-mon[79770]: 4.1d scrub ok
Dec 06 09:41:00 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/2854219236' entity='client.admin' cmd=[{"prefix": "mgr module disable", "module": "dashboard"}]: dispatch
Dec 06 09:41:00 compute-1 sshd-session[80151]: Received disconnect from 222.88.225.195 port 56606:11:  [preauth]
Dec 06 09:41:00 compute-1 sshd-session[80151]: Disconnected from authenticating user root 222.88.225.195 port 56606 [preauth]
Dec 06 09:41:01 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 5.1d scrub starts
Dec 06 09:41:01 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 5.1d scrub ok
Dec 06 09:41:01 compute-1 sudo[80153]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:41:01 compute-1 sudo[80153]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:41:01 compute-1 sudo[80153]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:01 compute-1 ceph-mon[79770]: 5.1b scrub starts
Dec 06 09:41:01 compute-1 ceph-mon[79770]: 5.1b scrub ok
Dec 06 09:41:01 compute-1 ceph-mon[79770]: 4.1c scrub starts
Dec 06 09:41:01 compute-1 ceph-mon[79770]: 4.1c scrub ok
Dec 06 09:41:01 compute-1 ceph-mon[79770]: pgmap v113: 131 pgs: 131 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Dec 06 09:41:01 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/2854219236' entity='client.admin' cmd='[{"prefix": "mgr module disable", "module": "dashboard"}]': finished
Dec 06 09:41:01 compute-1 ceph-mon[79770]: mgrmap e12: compute-0.qhdjwa(active, since 2m), standbys: compute-1.sauzid, compute-2.oazbvn
Dec 06 09:41:01 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:41:01 compute-1 ceph-mon[79770]: from='mgr.14122 192.168.122.100:0/2031792771' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:41:01 compute-1 anacron[4460]: Job `cron.weekly' started
Dec 06 09:41:01 compute-1 anacron[4460]: Job `cron.weekly' terminated
Dec 06 09:41:01 compute-1 sudo[80178]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:41:01 compute-1 sudo[80178]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:41:01 compute-1 sudo[80178]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:01 compute-1 sudo[80205]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Dec 06 09:41:01 compute-1 sudo[80205]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:41:02 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.1b scrub starts
Dec 06 09:41:02 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.1b scrub ok
Dec 06 09:41:02 compute-1 podman[80301]: 2025-12-06 09:41:02.700428456 +0000 UTC m=+0.088899157 container exec 500f8c89b5c281d0ace139835b07ea5bc6259ce3e028d8284d57da97424b55e0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-crash-compute-1, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 06 09:41:02 compute-1 ceph-mon[79770]: 4.1a deep-scrub starts
Dec 06 09:41:02 compute-1 ceph-mon[79770]: 4.1a deep-scrub ok
Dec 06 09:41:02 compute-1 ceph-mon[79770]: 5.1d scrub starts
Dec 06 09:41:02 compute-1 ceph-mon[79770]: 5.1d scrub ok
Dec 06 09:41:02 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/2146703949' entity='client.admin' cmd=[{"prefix": "mgr module enable", "module": "dashboard"}]: dispatch
Dec 06 09:41:02 compute-1 ceph-mgr[80080]: mgr handle_mgr_map respawning because set of enabled modules changed!
Dec 06 09:41:02 compute-1 ceph-mgr[80080]: mgr respawn  e: '/usr/bin/ceph-mgr'
Dec 06 09:41:02 compute-1 ceph-mgr[80080]: mgr respawn  0: '/usr/bin/ceph-mgr'
Dec 06 09:41:02 compute-1 ceph-mgr[80080]: mgr respawn  1: '-n'
Dec 06 09:41:02 compute-1 ceph-mgr[80080]: mgr respawn  2: 'mgr.compute-1.sauzid'
Dec 06 09:41:02 compute-1 ceph-mgr[80080]: mgr respawn  3: '-f'
Dec 06 09:41:02 compute-1 ceph-mgr[80080]: mgr respawn  4: '--setuser'
Dec 06 09:41:02 compute-1 ceph-mgr[80080]: mgr respawn  5: 'ceph'
Dec 06 09:41:02 compute-1 ceph-mgr[80080]: mgr respawn  6: '--setgroup'
Dec 06 09:41:02 compute-1 ceph-mgr[80080]: mgr respawn  7: 'ceph'
Dec 06 09:41:02 compute-1 ceph-mgr[80080]: mgr respawn  8: '--default-log-to-file=false'
Dec 06 09:41:02 compute-1 ceph-mgr[80080]: mgr respawn  9: '--default-log-to-journald=true'
Dec 06 09:41:02 compute-1 ceph-mgr[80080]: mgr respawn  10: '--default-log-to-stderr=false'
Dec 06 09:41:02 compute-1 ceph-mgr[80080]: mgr respawn respawning with exe /usr/bin/ceph-mgr
Dec 06 09:41:02 compute-1 ceph-mgr[80080]: mgr respawn  exe_path /proc/self/exe
Dec 06 09:41:02 compute-1 podman[80301]: 2025-12-06 09:41:02.80675084 +0000 UTC m=+0.195221571 container exec_died 500f8c89b5c281d0ace139835b07ea5bc6259ce3e028d8284d57da97424b55e0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-crash-compute-1, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20250325, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2)
Dec 06 09:41:02 compute-1 sshd-session[72868]: Connection closed by 192.168.122.100 port 59142
Dec 06 09:41:02 compute-1 sshd-session[72812]: Connection closed by 192.168.122.100 port 59122
Dec 06 09:41:02 compute-1 sshd-session[72725]: Connection closed by 192.168.122.100 port 59092
Dec 06 09:41:02 compute-1 sshd-session[72783]: Connection closed by 192.168.122.100 port 59116
Dec 06 09:41:02 compute-1 sshd-session[72667]: Connection closed by 192.168.122.100 port 59076
Dec 06 09:41:02 compute-1 sshd-session[72754]: Connection closed by 192.168.122.100 port 59102
Dec 06 09:41:02 compute-1 sshd-session[72839]: Connection closed by 192.168.122.100 port 59134
Dec 06 09:41:02 compute-1 sshd-session[72580]: Connection closed by 192.168.122.100 port 59058
Dec 06 09:41:02 compute-1 sshd-session[72696]: Connection closed by 192.168.122.100 port 59082
Dec 06 09:41:02 compute-1 sshd-session[72638]: Connection closed by 192.168.122.100 port 59074
Dec 06 09:41:02 compute-1 sshd-session[72609]: Connection closed by 192.168.122.100 port 59060
Dec 06 09:41:02 compute-1 sshd-session[72579]: Connection closed by 192.168.122.100 port 59048
Dec 06 09:41:02 compute-1 sshd-session[72574]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 06 09:41:02 compute-1 sshd-session[72751]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 06 09:41:02 compute-1 sshd-session[72865]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 06 09:41:02 compute-1 systemd[1]: session-22.scope: Deactivated successfully.
Dec 06 09:41:02 compute-1 systemd-logind[788]: Session 22 logged out. Waiting for processes to exit.
Dec 06 09:41:02 compute-1 systemd[1]: session-28.scope: Deactivated successfully.
Dec 06 09:41:02 compute-1 systemd-logind[788]: Session 28 logged out. Waiting for processes to exit.
Dec 06 09:41:02 compute-1 systemd-logind[788]: Removed session 22.
Dec 06 09:41:02 compute-1 sshd-session[72664]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 06 09:41:02 compute-1 systemd-logind[788]: Session 32 logged out. Waiting for processes to exit.
Dec 06 09:41:02 compute-1 sshd-session[72693]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 06 09:41:02 compute-1 systemd-logind[788]: Removed session 28.
Dec 06 09:41:02 compute-1 sshd-session[72556]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 06 09:41:02 compute-1 sshd-session[72780]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 06 09:41:02 compute-1 sshd-session[72809]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 06 09:41:02 compute-1 sshd-session[72836]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 06 09:41:02 compute-1 sshd-session[72722]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 06 09:41:02 compute-1 sshd-session[72606]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 06 09:41:02 compute-1 sshd-session[72635]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 06 09:41:02 compute-1 systemd-logind[788]: Session 30 logged out. Waiting for processes to exit.
Dec 06 09:41:02 compute-1 systemd-logind[788]: Session 26 logged out. Waiting for processes to exit.
Dec 06 09:41:02 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e31 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 09:41:02 compute-1 systemd[1]: session-30.scope: Deactivated successfully.
Dec 06 09:41:02 compute-1 systemd[1]: session-26.scope: Deactivated successfully.
Dec 06 09:41:02 compute-1 systemd-logind[788]: Removed session 30.
Dec 06 09:41:02 compute-1 systemd-logind[788]: Removed session 26.
Dec 06 09:41:02 compute-1 systemd[1]: session-23.scope: Deactivated successfully.
Dec 06 09:41:02 compute-1 systemd[1]: session-25.scope: Deactivated successfully.
Dec 06 09:41:02 compute-1 systemd[1]: session-20.scope: Deactivated successfully.
Dec 06 09:41:02 compute-1 systemd-logind[788]: Session 25 logged out. Waiting for processes to exit.
Dec 06 09:41:02 compute-1 systemd[1]: session-27.scope: Deactivated successfully.
Dec 06 09:41:02 compute-1 systemd-logind[788]: Session 23 logged out. Waiting for processes to exit.
Dec 06 09:41:02 compute-1 systemd[1]: session-31.scope: Deactivated successfully.
Dec 06 09:41:02 compute-1 systemd[1]: session-24.scope: Deactivated successfully.
Dec 06 09:41:02 compute-1 systemd-logind[788]: Session 20 logged out. Waiting for processes to exit.
Dec 06 09:41:02 compute-1 systemd[1]: session-29.scope: Deactivated successfully.
Dec 06 09:41:02 compute-1 systemd-logind[788]: Session 27 logged out. Waiting for processes to exit.
Dec 06 09:41:02 compute-1 systemd-logind[788]: Session 24 logged out. Waiting for processes to exit.
Dec 06 09:41:02 compute-1 systemd-logind[788]: Session 31 logged out. Waiting for processes to exit.
Dec 06 09:41:02 compute-1 systemd-logind[788]: Session 29 logged out. Waiting for processes to exit.
Dec 06 09:41:02 compute-1 systemd-logind[788]: Removed session 23.
Dec 06 09:41:02 compute-1 systemd-logind[788]: Removed session 25.
Dec 06 09:41:02 compute-1 systemd-logind[788]: Removed session 20.
Dec 06 09:41:02 compute-1 systemd-logind[788]: Removed session 27.
Dec 06 09:41:02 compute-1 systemd-logind[788]: Removed session 31.
Dec 06 09:41:02 compute-1 systemd-logind[788]: Removed session 24.
Dec 06 09:41:02 compute-1 systemd-logind[788]: Removed session 29.
Dec 06 09:41:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: ignoring --setuser ceph since I am not root
Dec 06 09:41:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: ignoring --setgroup ceph since I am not root
Dec 06 09:41:02 compute-1 ceph-mgr[80080]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Dec 06 09:41:02 compute-1 ceph-mgr[80080]: pidfile_write: ignore empty --pid-file
Dec 06 09:41:02 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'alerts'
Dec 06 09:41:03 compute-1 ceph-mgr[80080]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec 06 09:41:03 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'balancer'
Dec 06 09:41:03 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:03.032+0000 7f880081b140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec 06 09:41:03 compute-1 ceph-mgr[80080]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec 06 09:41:03 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:03.117+0000 7f880081b140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec 06 09:41:03 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'cephadm'
Dec 06 09:41:03 compute-1 sudo[80205]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:03 compute-1 systemd[1]: session-32.scope: Deactivated successfully.
Dec 06 09:41:03 compute-1 systemd[1]: session-32.scope: Consumed 1min 26.086s CPU time.
Dec 06 09:41:03 compute-1 systemd-logind[788]: Removed session 32.
Dec 06 09:41:03 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 4.19 deep-scrub starts
Dec 06 09:41:03 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 4.19 deep-scrub ok
Dec 06 09:41:03 compute-1 ceph-mon[79770]: 4.1b scrub starts
Dec 06 09:41:03 compute-1 ceph-mon[79770]: 4.1b scrub ok
Dec 06 09:41:03 compute-1 ceph-mon[79770]: 3.1b scrub starts
Dec 06 09:41:03 compute-1 ceph-mon[79770]: 3.1b scrub ok
Dec 06 09:41:03 compute-1 ceph-mon[79770]: pgmap v114: 131 pgs: 131 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Dec 06 09:41:03 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/2146703949' entity='client.admin' cmd='[{"prefix": "mgr module enable", "module": "dashboard"}]': finished
Dec 06 09:41:03 compute-1 ceph-mon[79770]: mgrmap e13: compute-0.qhdjwa(active, since 3m), standbys: compute-1.sauzid, compute-2.oazbvn
Dec 06 09:41:03 compute-1 ceph-mon[79770]: from='osd.2 ' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
Dec 06 09:41:03 compute-1 ceph-mon[79770]: from='osd.2 [v2:192.168.122.102:6800/709563040,v1:192.168.122.102:6801/709563040]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
Dec 06 09:41:03 compute-1 ceph-mon[79770]: 5.f deep-scrub starts
Dec 06 09:41:03 compute-1 ceph-mon[79770]: 5.f deep-scrub ok
Dec 06 09:41:03 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e32 e32: 3 total, 2 up, 3 in
Dec 06 09:41:03 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'crash'
Dec 06 09:41:04 compute-1 ceph-mgr[80080]: mgr[py] Module crash has missing NOTIFY_TYPES member
Dec 06 09:41:04 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'dashboard'
Dec 06 09:41:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:04.013+0000 7f880081b140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Dec 06 09:41:04 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.1f scrub starts
Dec 06 09:41:04 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.1f scrub ok
Dec 06 09:41:04 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'devicehealth'
Dec 06 09:41:04 compute-1 ceph-mgr[80080]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec 06 09:41:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:04.731+0000 7f880081b140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec 06 09:41:04 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'diskprediction_local'
Dec 06 09:41:04 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e33 e33: 3 total, 2 up, 3 in
Dec 06 09:41:04 compute-1 ceph-mon[79770]: 4.19 deep-scrub starts
Dec 06 09:41:04 compute-1 ceph-mon[79770]: 4.19 deep-scrub ok
Dec 06 09:41:04 compute-1 ceph-mon[79770]: from='osd.2 ' entity='osd.2' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]': finished
Dec 06 09:41:04 compute-1 ceph-mon[79770]: osdmap e32: 3 total, 2 up, 3 in
Dec 06 09:41:04 compute-1 ceph-mon[79770]: from='osd.2 ' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-2", "root=default"]}]: dispatch
Dec 06 09:41:04 compute-1 ceph-mon[79770]: from='osd.2 [v2:192.168.122.102:6800/709563040,v1:192.168.122.102:6801/709563040]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-2", "root=default"]}]: dispatch
Dec 06 09:41:04 compute-1 ceph-mon[79770]: 3.1c scrub starts
Dec 06 09:41:04 compute-1 ceph-mon[79770]: 3.1c scrub ok
Dec 06 09:41:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Dec 06 09:41:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Dec 06 09:41:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]:   from numpy import show_config as show_numpy_config
Dec 06 09:41:04 compute-1 ceph-mgr[80080]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec 06 09:41:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:04.919+0000 7f880081b140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec 06 09:41:04 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'influx'
Dec 06 09:41:04 compute-1 ceph-mgr[80080]: mgr[py] Module influx has missing NOTIFY_TYPES member
Dec 06 09:41:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:04.996+0000 7f880081b140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Dec 06 09:41:04 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'insights'
Dec 06 09:41:05 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'iostat'
Dec 06 09:41:05 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[5.13( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=33 pruub=10.176478386s) [] r=-1 lpr=33 pi=[27,33)/1 crt=0'0 mlcod 0'0 active pruub 99.331306458s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:41:05 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[5.13( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=33 pruub=10.176478386s) [] r=-1 lpr=33 pi=[27,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 99.331306458s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:41:05 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[2.15( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=29/29 les/c/f=30/30/0 sis=33 pruub=10.811074257s) [] r=-1 lpr=33 pi=[29,33)/1 crt=0'0 mlcod 0'0 active pruub 99.966171265s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:41:05 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[2.15( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=29/29 les/c/f=30/30/0 sis=33 pruub=10.811074257s) [] r=-1 lpr=33 pi=[29,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 99.966171265s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:41:05 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[5.12( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=33 pruub=10.176123619s) [] r=-1 lpr=33 pi=[27,33)/1 crt=0'0 mlcod 0'0 active pruub 99.331329346s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:41:05 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[5.12( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=33 pruub=10.176123619s) [] r=-1 lpr=33 pi=[27,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 99.331329346s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:41:05 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[4.14( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=33 pruub=15.052652359s) [] r=-1 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 active pruub 104.207954407s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:41:05 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[4.14( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=33 pruub=15.052652359s) [] r=-1 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 104.207954407s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:41:05 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[2.13( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=29/29 les/c/f=30/30/0 sis=33 pruub=10.810589790s) [] r=-1 lpr=33 pi=[29,33)/1 crt=0'0 mlcod 0'0 active pruub 99.965995789s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:41:05 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[2.13( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=29/29 les/c/f=30/30/0 sis=33 pruub=10.810589790s) [] r=-1 lpr=33 pi=[29,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 99.965995789s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:41:05 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[2.10( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=29/29 les/c/f=30/30/0 sis=33 pruub=10.810255051s) [] r=-1 lpr=33 pi=[29,33)/1 crt=0'0 mlcod 0'0 active pruub 99.966018677s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:41:05 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[2.10( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=29/29 les/c/f=30/30/0 sis=33 pruub=10.810255051s) [] r=-1 lpr=33 pi=[29,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 99.966018677s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:41:05 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[5.8( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=33 pruub=10.175810814s) [] r=-1 lpr=33 pi=[27,33)/1 crt=0'0 mlcod 0'0 active pruub 99.331695557s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:41:05 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[2.c( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=29/29 les/c/f=30/30/0 sis=33 pruub=10.810143471s) [] r=-1 lpr=33 pi=[29,33)/1 crt=0'0 mlcod 0'0 active pruub 99.966064453s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:41:05 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[2.c( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=29/29 les/c/f=30/30/0 sis=33 pruub=10.810143471s) [] r=-1 lpr=33 pi=[29,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 99.966064453s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:41:05 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[5.8( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=33 pruub=10.175810814s) [] r=-1 lpr=33 pi=[27,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 99.331695557s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:41:05 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[5.b( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=33 pruub=10.175298691s) [] r=-1 lpr=33 pi=[27,33)/1 crt=0'0 mlcod 0'0 active pruub 99.331344604s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:41:05 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[5.b( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=33 pruub=10.175298691s) [] r=-1 lpr=33 pi=[27,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 99.331344604s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:41:05 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[2.d( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=29/29 les/c/f=30/30/0 sis=33 pruub=10.809710503s) [] r=-1 lpr=33 pi=[29,33)/1 crt=0'0 mlcod 0'0 active pruub 99.965881348s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:41:05 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[2.a( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=29/29 les/c/f=30/30/0 sis=33 pruub=10.809786797s) [] r=-1 lpr=33 pi=[29,33)/1 crt=0'0 mlcod 0'0 active pruub 99.965988159s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:41:05 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[2.a( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=29/29 les/c/f=30/30/0 sis=33 pruub=10.809786797s) [] r=-1 lpr=33 pi=[29,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 99.965988159s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:41:05 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[2.d( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=29/29 les/c/f=30/30/0 sis=33 pruub=10.809710503s) [] r=-1 lpr=33 pi=[29,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 99.965881348s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:41:05 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[5.d( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=33 pruub=10.175025940s) [] r=-1 lpr=33 pi=[27,33)/1 crt=0'0 mlcod 0'0 active pruub 99.331336975s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:41:05 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[5.d( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=33 pruub=10.175025940s) [] r=-1 lpr=33 pi=[27,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 99.331336975s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:41:05 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[3.0( empty local-lis/les=24/25 n=0 ec=15/15 lis/c=24/24 les/c/f=25/25/0 sis=33 pruub=14.045259476s) [] r=-1 lpr=33 pi=[24,33)/1 crt=0'0 mlcod 0'0 active pruub 103.201667786s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:41:05 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[3.0( empty local-lis/les=24/25 n=0 ec=15/15 lis/c=24/24 les/c/f=25/25/0 sis=33 pruub=14.045259476s) [] r=-1 lpr=33 pi=[24,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 103.201667786s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:41:05 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[5.0( empty local-lis/les=27/28 n=0 ec=19/19 lis/c=27/27 les/c/f=28/28/0 sis=33 pruub=10.175087929s) [] r=-1 lpr=33 pi=[27,33)/1 crt=0'0 mlcod 0'0 active pruub 99.331726074s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:41:05 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[5.0( empty local-lis/les=27/28 n=0 ec=19/19 lis/c=27/27 les/c/f=28/28/0 sis=33 pruub=10.175087929s) [] r=-1 lpr=33 pi=[27,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 99.331726074s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:41:05 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[4.2( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=33 pruub=15.051401138s) [] r=-1 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 active pruub 104.208145142s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:41:05 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[4.2( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=33 pruub=15.051401138s) [] r=-1 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 104.208145142s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:41:05 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[4.6( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=33 pruub=15.051333427s) [] r=-1 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 active pruub 104.208114624s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:41:05 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[4.6( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=33 pruub=15.051333427s) [] r=-1 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 104.208114624s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:41:05 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[4.3( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=33 pruub=15.051722527s) [] r=-1 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 active pruub 104.208694458s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:41:05 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[4.3( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=33 pruub=15.051722527s) [] r=-1 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 104.208694458s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:41:05 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[3.8( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=33 pruub=14.051517487s) [] r=-1 lpr=33 pi=[24,33)/1 crt=0'0 mlcod 0'0 active pruub 103.208557129s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:41:05 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[3.8( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=33 pruub=14.051517487s) [] r=-1 lpr=33 pi=[24,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 103.208557129s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:41:05 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[4.1d( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=33 pruub=15.051094055s) [] r=-1 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 active pruub 104.208251953s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:41:05 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[4.1c( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=33 pruub=15.053232193s) [] r=-1 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 active pruub 104.210411072s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:41:05 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[4.1d( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=33 pruub=15.051094055s) [] r=-1 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 104.208251953s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:41:05 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[4.1c( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=33 pruub=15.053232193s) [] r=-1 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 104.210411072s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:41:05 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[3.1b( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=33 pruub=14.051561356s) [] r=-1 lpr=33 pi=[24,33)/1 crt=0'0 mlcod 0'0 active pruub 103.208839417s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:41:05 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[3.1b( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=33 pruub=14.051561356s) [] r=-1 lpr=33 pi=[24,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 103.208839417s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:41:05 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[4.19( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=33 pruub=15.053050041s) [] r=-1 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 active pruub 104.210380554s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:41:05 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[4.19( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=33 pruub=15.053050041s) [] r=-1 lpr=33 pi=[25,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 104.210380554s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:41:05 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[2.1b( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=29/29 les/c/f=30/30/0 sis=33 pruub=10.808290482s) [] r=-1 lpr=33 pi=[29,33)/1 crt=0'0 mlcod 0'0 active pruub 99.965744019s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:41:05 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 33 pg[2.1b( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=29/29 les/c/f=30/30/0 sis=33 pruub=10.808290482s) [] r=-1 lpr=33 pi=[29,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 99.965744019s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:41:05 compute-1 ceph-mgr[80080]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec 06 09:41:05 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:05.132+0000 7f880081b140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec 06 09:41:05 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'k8sevents'
Dec 06 09:41:05 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.1e deep-scrub starts
Dec 06 09:41:05 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 3.1e deep-scrub ok
Dec 06 09:41:05 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'localpool'
Dec 06 09:41:05 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'mds_autoscaler'
Dec 06 09:41:05 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'mirroring'
Dec 06 09:41:05 compute-1 ceph-mon[79770]: purged_snaps scrub starts
Dec 06 09:41:05 compute-1 ceph-mon[79770]: purged_snaps scrub ok
Dec 06 09:41:05 compute-1 ceph-mon[79770]: 3.1f scrub starts
Dec 06 09:41:05 compute-1 ceph-mon[79770]: 3.1f scrub ok
Dec 06 09:41:05 compute-1 ceph-mon[79770]: from='osd.2 ' entity='osd.2' cmd='[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-2", "root=default"]}]': finished
Dec 06 09:41:05 compute-1 ceph-mon[79770]: osdmap e33: 3 total, 2 up, 3 in
Dec 06 09:41:05 compute-1 ceph-mon[79770]: 5.18 scrub starts
Dec 06 09:41:05 compute-1 ceph-mon[79770]: 5.18 scrub ok
Dec 06 09:41:05 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'nfs'
Dec 06 09:41:06 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 4.f scrub starts
Dec 06 09:41:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:06.271+0000 7f880081b140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec 06 09:41:06 compute-1 ceph-mgr[80080]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec 06 09:41:06 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'orchestrator'
Dec 06 09:41:06 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 4.f scrub ok
Dec 06 09:41:06 compute-1 ceph-mgr[80080]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec 06 09:41:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:06.505+0000 7f880081b140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec 06 09:41:06 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'osd_perf_query'
Dec 06 09:41:06 compute-1 ceph-mgr[80080]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec 06 09:41:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:06.589+0000 7f880081b140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec 06 09:41:06 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'osd_support'
Dec 06 09:41:06 compute-1 ceph-mgr[80080]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec 06 09:41:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:06.662+0000 7f880081b140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec 06 09:41:06 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'pg_autoscaler'
Dec 06 09:41:06 compute-1 ceph-mgr[80080]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec 06 09:41:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:06.753+0000 7f880081b140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec 06 09:41:06 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'progress'
Dec 06 09:41:06 compute-1 ceph-mgr[80080]: mgr[py] Module progress has missing NOTIFY_TYPES member
Dec 06 09:41:06 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'prometheus'
Dec 06 09:41:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:06.850+0000 7f880081b140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Dec 06 09:41:06 compute-1 ceph-mon[79770]: 3.1e deep-scrub starts
Dec 06 09:41:06 compute-1 ceph-mon[79770]: 3.1e deep-scrub ok
Dec 06 09:41:06 compute-1 ceph-mon[79770]: 4.18 scrub starts
Dec 06 09:41:06 compute-1 ceph-mon[79770]: 4.18 scrub ok
Dec 06 09:41:07 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 5.19 scrub starts
Dec 06 09:41:07 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 5.19 scrub ok
Dec 06 09:41:07 compute-1 ceph-mgr[80080]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec 06 09:41:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:07.264+0000 7f880081b140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec 06 09:41:07 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'rbd_support'
Dec 06 09:41:07 compute-1 ceph-mgr[80080]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec 06 09:41:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:07.361+0000 7f880081b140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec 06 09:41:07 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'restful'
Dec 06 09:41:07 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'rgw'
Dec 06 09:41:07 compute-1 ceph-mgr[80080]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec 06 09:41:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:07.853+0000 7f880081b140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec 06 09:41:07 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'rook'
Dec 06 09:41:07 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e33 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 09:41:07 compute-1 ceph-mon[79770]: 4.f scrub starts
Dec 06 09:41:07 compute-1 ceph-mon[79770]: 4.f scrub ok
Dec 06 09:41:07 compute-1 ceph-mon[79770]: 3.3 scrub starts
Dec 06 09:41:07 compute-1 ceph-mon[79770]: 3.3 scrub ok
Dec 06 09:41:08 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 2.19 scrub starts
Dec 06 09:41:08 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 2.19 scrub ok
Dec 06 09:41:08 compute-1 ceph-mgr[80080]: mgr[py] Module rook has missing NOTIFY_TYPES member
Dec 06 09:41:08 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:08.517+0000 7f880081b140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Dec 06 09:41:08 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'selftest'
Dec 06 09:41:08 compute-1 ceph-mgr[80080]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec 06 09:41:08 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:08.597+0000 7f880081b140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec 06 09:41:08 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'snap_schedule'
Dec 06 09:41:08 compute-1 ceph-mgr[80080]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Dec 06 09:41:08 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:08.697+0000 7f880081b140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Dec 06 09:41:08 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'stats'
Dec 06 09:41:08 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'status'
Dec 06 09:41:08 compute-1 ceph-mgr[80080]: mgr[py] Module status has missing NOTIFY_TYPES member
Dec 06 09:41:08 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:08.872+0000 7f880081b140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Dec 06 09:41:08 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'telegraf'
Dec 06 09:41:08 compute-1 ceph-mon[79770]: 5.19 scrub starts
Dec 06 09:41:08 compute-1 ceph-mon[79770]: 5.19 scrub ok
Dec 06 09:41:08 compute-1 ceph-mon[79770]: 5.1c scrub starts
Dec 06 09:41:08 compute-1 ceph-mon[79770]: 5.1c scrub ok
Dec 06 09:41:08 compute-1 ceph-mon[79770]: Standby manager daemon compute-2.oazbvn restarted
Dec 06 09:41:08 compute-1 ceph-mon[79770]: Standby manager daemon compute-2.oazbvn started
Dec 06 09:41:08 compute-1 ceph-mgr[80080]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec 06 09:41:08 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:08.971+0000 7f880081b140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec 06 09:41:08 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'telemetry'
Dec 06 09:41:09 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e34 e34: 3 total, 2 up, 3 in
Dec 06 09:41:09 compute-1 ceph-mgr[80080]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec 06 09:41:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:09.343+0000 7f880081b140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec 06 09:41:09 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'test_orchestrator'
Dec 06 09:41:09 compute-1 ceph-mgr[80080]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec 06 09:41:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:09.677+0000 7f880081b140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec 06 09:41:09 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'volumes'
Dec 06 09:41:09 compute-1 sshd-session[80417]: Accepted publickey for ceph-admin from 192.168.122.100 port 38444 ssh2: RSA SHA256:Gxeh0g0CuyN5zOpDUv+8o0JynyC1ASnaMny1857KGxo
Dec 06 09:41:09 compute-1 systemd-logind[788]: New session 33 of user ceph-admin.
Dec 06 09:41:09 compute-1 systemd[1]: Started Session 33 of User ceph-admin.
Dec 06 09:41:09 compute-1 sshd-session[80417]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Dec 06 09:41:09 compute-1 sudo[80421]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:41:09 compute-1 sudo[80421]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:41:09 compute-1 sudo[80421]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:09 compute-1 sudo[80446]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Dec 06 09:41:10 compute-1 sudo[80446]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:41:10 compute-1 ceph-mon[79770]: 2.19 scrub starts
Dec 06 09:41:10 compute-1 ceph-mon[79770]: 2.19 scrub ok
Dec 06 09:41:10 compute-1 ceph-mon[79770]: mgrmap e14: compute-0.qhdjwa(active, since 3m), standbys: compute-1.sauzid, compute-2.oazbvn
Dec 06 09:41:10 compute-1 ceph-mon[79770]: Active manager daemon compute-0.qhdjwa restarted
Dec 06 09:41:10 compute-1 ceph-mon[79770]: Activating manager daemon compute-0.qhdjwa
Dec 06 09:41:10 compute-1 ceph-mon[79770]: osdmap e34: 3 total, 2 up, 3 in
Dec 06 09:41:10 compute-1 ceph-mon[79770]: mgrmap e15: compute-0.qhdjwa(active, starting, since 0.0389693s), standbys: compute-1.sauzid, compute-2.oazbvn
Dec 06 09:41:10 compute-1 ceph-mon[79770]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "mon metadata", "id": "compute-0"}]: dispatch
Dec 06 09:41:10 compute-1 ceph-mon[79770]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Dec 06 09:41:10 compute-1 ceph-mon[79770]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Dec 06 09:41:10 compute-1 ceph-mon[79770]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "mgr metadata", "who": "compute-0.qhdjwa", "id": "compute-0.qhdjwa"}]: dispatch
Dec 06 09:41:10 compute-1 ceph-mon[79770]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "mgr metadata", "who": "compute-1.sauzid", "id": "compute-1.sauzid"}]: dispatch
Dec 06 09:41:10 compute-1 ceph-mon[79770]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "mgr metadata", "who": "compute-2.oazbvn", "id": "compute-2.oazbvn"}]: dispatch
Dec 06 09:41:10 compute-1 ceph-mon[79770]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Dec 06 09:41:10 compute-1 ceph-mon[79770]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Dec 06 09:41:10 compute-1 ceph-mon[79770]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec 06 09:41:10 compute-1 ceph-mon[79770]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "mds metadata"}]: dispatch
Dec 06 09:41:10 compute-1 ceph-mon[79770]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd metadata"}]: dispatch
Dec 06 09:41:10 compute-1 ceph-mon[79770]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "mon metadata"}]: dispatch
Dec 06 09:41:10 compute-1 ceph-mon[79770]: Manager daemon compute-0.qhdjwa is now available
Dec 06 09:41:10 compute-1 ceph-mon[79770]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.qhdjwa/mirror_snapshot_schedule"}]: dispatch
Dec 06 09:41:10 compute-1 ceph-mgr[80080]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec 06 09:41:10 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'zabbix'
Dec 06 09:41:10 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:10.056+0000 7f880081b140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec 06 09:41:10 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:10.176+0000 7f880081b140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Dec 06 09:41:10 compute-1 ceph-mgr[80080]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Dec 06 09:41:10 compute-1 ceph-mgr[80080]: [dashboard DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 06 09:41:10 compute-1 ceph-mgr[80080]: mgr load Constructed class from module: dashboard
Dec 06 09:41:10 compute-1 ceph-mgr[80080]: [dashboard INFO root] server: ssl=no host=192.168.122.101 port=8443
Dec 06 09:41:10 compute-1 ceph-mgr[80080]: [dashboard INFO root] Configured CherryPy, starting engine...
Dec 06 09:41:10 compute-1 ceph-mgr[80080]: [dashboard INFO root] Starting engine...
Dec 06 09:41:10 compute-1 ceph-mgr[80080]: ms_deliver_dispatch: unhandled message 0x55cc3d915860 mon_map magic: 0 from mon.2 v2:192.168.122.101:3300/0
Dec 06 09:41:10 compute-1 ceph-mgr[80080]: [dashboard INFO root] Engine started...
Dec 06 09:41:11 compute-1 podman[80554]: 2025-12-06 09:41:11.109317284 +0000 UTC m=+0.090581227 container exec 500f8c89b5c281d0ace139835b07ea5bc6259ce3e028d8284d57da97424b55e0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-crash-compute-1, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_REF=squid, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 06 09:41:11 compute-1 podman[80554]: 2025-12-06 09:41:11.242774691 +0000 UTC m=+0.224038594 container exec_died 500f8c89b5c281d0ace139835b07ea5bc6259ce3e028d8284d57da97424b55e0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-crash-compute-1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325)
Dec 06 09:41:11 compute-1 ceph-mon[79770]: mgrmap e16: compute-0.qhdjwa(active, since 1.10991s), standbys: compute-1.sauzid, compute-2.oazbvn
Dec 06 09:41:11 compute-1 ceph-mon[79770]: from='client.14313 -' entity='client.admin' cmd=[{"prefix": "dashboard set-grafana-api-username", "value": "admin", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 09:41:11 compute-1 ceph-mon[79770]: pgmap v3: 131 pgs: 87 active+clean, 44 unknown; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Dec 06 09:41:11 compute-1 ceph-mon[79770]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec 06 09:41:11 compute-1 ceph-mon[79770]: Standby manager daemon compute-1.sauzid restarted
Dec 06 09:41:11 compute-1 ceph-mon[79770]: Standby manager daemon compute-1.sauzid started
Dec 06 09:41:11 compute-1 ceph-mon[79770]: OSD bench result of 3012.211775 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.2. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Dec 06 09:41:11 compute-1 ceph-mon[79770]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:41:11 compute-1 ceph-mon[79770]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:41:11 compute-1 ceph-mon[79770]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:41:11 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e35 e35: 3 total, 3 up, 3 in
Dec 06 09:41:11 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[5.13( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=35 pruub=3.997880459s) [2] r=-1 lpr=35 pi=[27,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 99.331306458s@ mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:41:11 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[2.15( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=29/29 les/c/f=30/30/0 sis=35 pruub=4.632750034s) [2] r=-1 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 99.966171265s@ mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:41:11 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[5.12( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=35 pruub=3.997904301s) [2] r=-1 lpr=35 pi=[27,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 99.331329346s@ mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:41:11 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[5.13( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=35 pruub=3.997353077s) [2] r=-1 lpr=35 pi=[27,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 99.331306458s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:41:11 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[2.15( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=29/29 les/c/f=30/30/0 sis=35 pruub=4.632209778s) [2] r=-1 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 99.966171265s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:41:11 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[5.12( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=35 pruub=3.997376680s) [2] r=-1 lpr=35 pi=[27,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 99.331329346s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:41:11 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[5.8( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=35 pruub=3.996918201s) [2] r=-1 lpr=35 pi=[27,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 99.331695557s@ mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:41:11 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[5.8( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=35 pruub=3.996894836s) [2] r=-1 lpr=35 pi=[27,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 99.331695557s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:41:11 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[2.c( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=29/29 les/c/f=30/30/0 sis=35 pruub=4.631101608s) [2] r=-1 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 99.966064453s@ mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:41:11 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[2.10( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=29/29 les/c/f=30/30/0 sis=35 pruub=4.631021500s) [2] r=-1 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 99.966018677s@ mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:41:11 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[2.c( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=29/29 les/c/f=30/30/0 sis=35 pruub=4.631088734s) [2] r=-1 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 99.966064453s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:41:11 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[2.10( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=29/29 les/c/f=30/30/0 sis=35 pruub=4.630993366s) [2] r=-1 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 99.966018677s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:41:11 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[5.b( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=35 pruub=3.996212721s) [2] r=-1 lpr=35 pi=[27,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 99.331344604s@ mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:41:11 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[5.b( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=35 pruub=3.996188402s) [2] r=-1 lpr=35 pi=[27,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 99.331344604s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:41:11 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[2.d( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=29/29 les/c/f=30/30/0 sis=35 pruub=4.630615711s) [2] r=-1 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 99.965881348s@ mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:41:11 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[2.d( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=29/29 les/c/f=30/30/0 sis=35 pruub=4.630573750s) [2] r=-1 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 99.965881348s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:41:11 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[2.13( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=29/29 les/c/f=30/30/0 sis=35 pruub=4.630668163s) [2] r=-1 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 99.965995789s@ mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:41:11 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[5.d( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=35 pruub=3.995894432s) [2] r=-1 lpr=35 pi=[27,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 99.331336975s@ mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:41:11 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[5.d( empty local-lis/les=27/28 n=0 ec=27/19 lis/c=27/27 les/c/f=28/28/0 sis=35 pruub=3.995878220s) [2] r=-1 lpr=35 pi=[27,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 99.331336975s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:41:11 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[2.a( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=29/29 les/c/f=30/30/0 sis=35 pruub=4.630395412s) [2] r=-1 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 99.965988159s@ mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:41:11 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[2.13( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=29/29 les/c/f=30/30/0 sis=35 pruub=4.630543232s) [2] r=-1 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 99.965995789s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:41:11 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[3.0( empty local-lis/les=24/25 n=0 ec=15/15 lis/c=24/24 les/c/f=25/25/0 sis=35 pruub=7.865968227s) [2] r=-1 lpr=35 pi=[24,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 103.201667786s@ mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:41:11 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[2.a( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=29/29 les/c/f=30/30/0 sis=35 pruub=4.630350113s) [2] r=-1 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 99.965988159s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:41:11 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[3.0( empty local-lis/les=24/25 n=0 ec=15/15 lis/c=24/24 les/c/f=25/25/0 sis=35 pruub=7.865921497s) [2] r=-1 lpr=35 pi=[24,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 103.201667786s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:41:11 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[5.0( empty local-lis/les=27/28 n=0 ec=19/19 lis/c=27/27 les/c/f=28/28/0 sis=35 pruub=3.995961189s) [2] r=-1 lpr=35 pi=[27,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 99.331726074s@ mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:41:11 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[5.0( empty local-lis/les=27/28 n=0 ec=19/19 lis/c=27/27 les/c/f=28/28/0 sis=35 pruub=3.995939732s) [2] r=-1 lpr=35 pi=[27,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 99.331726074s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:41:11 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[4.6( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=35 pruub=8.872111320s) [2] r=-1 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 104.208114624s@ mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:41:11 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[4.6( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=35 pruub=8.872088432s) [2] r=-1 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 104.208114624s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:41:11 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[3.8( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=35 pruub=7.872457027s) [2] r=-1 lpr=35 pi=[24,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 103.208557129s@ mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:41:11 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[4.2( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=35 pruub=8.871963501s) [2] r=-1 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 104.208145142s@ mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:41:11 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[3.8( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=35 pruub=7.872427464s) [2] r=-1 lpr=35 pi=[24,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 103.208557129s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:41:11 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[4.2( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=35 pruub=8.871884346s) [2] r=-1 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 104.208145142s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:41:11 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[2.1b( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=29/29 les/c/f=30/30/0 sis=35 pruub=4.629342079s) [2] r=-1 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 99.965744019s@ mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:41:11 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[2.1b( empty local-lis/les=29/30 n=0 ec=24/13 lis/c=29/29 les/c/f=30/30/0 sis=35 pruub=4.629287243s) [2] r=-1 lpr=35 pi=[29,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 99.965744019s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:41:11 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[4.1d( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=35 pruub=8.871744156s) [2] r=-1 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 104.208251953s@ mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:41:11 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[4.3( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=35 pruub=8.871904373s) [2] r=-1 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 104.208694458s@ mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:41:11 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[4.3( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=35 pruub=8.871889114s) [2] r=-1 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 104.208694458s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:41:11 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[4.1c( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=35 pruub=8.873806000s) [2] r=-1 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 104.210411072s@ mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:41:11 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[4.1d( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=35 pruub=8.871715546s) [2] r=-1 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 104.208251953s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:41:11 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[4.1c( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=35 pruub=8.873502731s) [2] r=-1 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 104.210411072s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:41:11 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[4.19( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=35 pruub=8.873318672s) [2] r=-1 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 104.210380554s@ mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:41:11 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[4.19( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=35 pruub=8.873299599s) [2] r=-1 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 104.210380554s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:41:11 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[3.1b( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=35 pruub=7.871504784s) [2] r=-1 lpr=35 pi=[24,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 103.208839417s@ mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:41:11 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[3.1b( empty local-lis/les=24/25 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=35 pruub=7.871484756s) [2] r=-1 lpr=35 pi=[24,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 103.208839417s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:41:11 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[4.14( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=35 pruub=8.870471954s) [2] r=-1 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 104.207954407s@ mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:41:11 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 35 pg[4.14( empty local-lis/les=25/26 n=0 ec=25/17 lis/c=25/25 les/c/f=26/26/0 sis=35 pruub=8.870453835s) [2] r=-1 lpr=35 pi=[25,35)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 104.207954407s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:41:11 compute-1 sudo[80446]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:11 compute-1 sudo[80641]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:41:11 compute-1 sudo[80641]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:41:11 compute-1 sudo[80641]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:11 compute-1 sudo[80666]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 06 09:41:11 compute-1 sudo[80666]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:41:12 compute-1 ceph-mon[79770]: [06/Dec/2025:09:41:11] ENGINE Bus STARTING
Dec 06 09:41:12 compute-1 ceph-mon[79770]: from='client.14337 -' entity='client.admin' cmd=[{"prefix": "dashboard set-grafana-api-password", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 09:41:12 compute-1 ceph-mon[79770]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec 06 09:41:12 compute-1 ceph-mon[79770]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:41:12 compute-1 ceph-mon[79770]: [06/Dec/2025:09:41:11] ENGINE Serving on https://192.168.122.100:7150
Dec 06 09:41:12 compute-1 ceph-mon[79770]: [06/Dec/2025:09:41:11] ENGINE Client ('192.168.122.100', 54474) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Dec 06 09:41:12 compute-1 ceph-mon[79770]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:41:12 compute-1 ceph-mon[79770]: mgrmap e17: compute-0.qhdjwa(active, since 2s), standbys: compute-1.sauzid, compute-2.oazbvn
Dec 06 09:41:12 compute-1 ceph-mon[79770]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:41:12 compute-1 ceph-mon[79770]: osd.2 [v2:192.168.122.102:6800/709563040,v1:192.168.122.102:6801/709563040] boot
Dec 06 09:41:12 compute-1 ceph-mon[79770]: osdmap e35: 3 total, 3 up, 3 in
Dec 06 09:41:12 compute-1 ceph-mon[79770]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec 06 09:41:12 compute-1 ceph-mon[79770]: [06/Dec/2025:09:41:11] ENGINE Serving on http://192.168.122.100:8765
Dec 06 09:41:12 compute-1 ceph-mon[79770]: [06/Dec/2025:09:41:11] ENGINE Bus STARTED
Dec 06 09:41:12 compute-1 ceph-mon[79770]: pgmap v5: 131 pgs: 87 active+clean, 44 unknown; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Dec 06 09:41:12 compute-1 ceph-mon[79770]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:41:12 compute-1 ceph-mon[79770]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:41:12 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e36 e36: 3 total, 3 up, 3 in
Dec 06 09:41:12 compute-1 sudo[80666]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:12 compute-1 sudo[80722]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:41:12 compute-1 sudo[80722]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:41:12 compute-1 sudo[80722]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:12 compute-1 sudo[80747]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 list-networks
Dec 06 09:41:12 compute-1 sudo[80747]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:41:12 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e36 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 09:41:12 compute-1 sudo[80747]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:13 compute-1 ceph-mon[79770]: from='client.14349 -' entity='client.admin' cmd=[{"prefix": "dashboard set-alertmanager-api-host", "value": "http://192.168.122.100:9093", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 09:41:13 compute-1 ceph-mon[79770]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:41:13 compute-1 ceph-mon[79770]: osdmap e36: 3 total, 3 up, 3 in
Dec 06 09:41:13 compute-1 ceph-mon[79770]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.qhdjwa/trash_purge_schedule"}]: dispatch
Dec 06 09:41:13 compute-1 ceph-mon[79770]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:41:13 compute-1 ceph-mon[79770]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:41:13 compute-1 ceph-mon[79770]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:41:13 compute-1 ceph-mon[79770]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:41:13 compute-1 ceph-mon[79770]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"}]: dispatch
Dec 06 09:41:13 compute-1 ceph-mon[79770]: Adjusting osd_memory_target on compute-0 to 128.0M
Dec 06 09:41:13 compute-1 ceph-mon[79770]: Unable to set osd_memory_target on compute-0 to 134217728: error parsing value: Value '134217728' is below minimum 939524096
Dec 06 09:41:13 compute-1 ceph-mon[79770]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:41:13 compute-1 ceph-mon[79770]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:41:13 compute-1 ceph-mon[79770]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"}]: dispatch
Dec 06 09:41:13 compute-1 ceph-mon[79770]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:41:14 compute-1 ceph-mon[79770]: 5.1a scrub starts
Dec 06 09:41:14 compute-1 ceph-mon[79770]: 5.1a scrub ok
Dec 06 09:41:14 compute-1 ceph-mon[79770]: Adjusting osd_memory_target on compute-1 to 127.9M
Dec 06 09:41:14 compute-1 ceph-mon[79770]: Unable to set osd_memory_target on compute-1 to 134211993: error parsing value: Value '134211993' is below minimum 939524096
Dec 06 09:41:14 compute-1 ceph-mon[79770]: from='client.14355 -' entity='client.admin' cmd=[{"prefix": "dashboard set-prometheus-api-host", "value": "http://192.168.122.100:9092", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 09:41:14 compute-1 ceph-mon[79770]: pgmap v7: 131 pgs: 44 peering, 87 active+clean; 449 KiB data, 480 MiB used, 60 GiB / 60 GiB avail
Dec 06 09:41:14 compute-1 ceph-mon[79770]: 2.13 scrub starts
Dec 06 09:41:14 compute-1 ceph-mon[79770]: mgrmap e18: compute-0.qhdjwa(active, since 4s), standbys: compute-1.sauzid, compute-2.oazbvn
Dec 06 09:41:14 compute-1 ceph-mon[79770]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:41:14 compute-1 sudo[80790]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 06 09:41:14 compute-1 sudo[80790]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:41:14 compute-1 sudo[80790]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:14 compute-1 sudo[80815]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/etc/ceph
Dec 06 09:41:14 compute-1 sudo[80815]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:41:14 compute-1 sudo[80815]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:14 compute-1 sudo[80840]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/etc/ceph/ceph.conf.new
Dec 06 09:41:14 compute-1 sudo[80840]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:41:14 compute-1 sudo[80840]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:14 compute-1 sudo[80865]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258
Dec 06 09:41:14 compute-1 sudo[80865]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:41:14 compute-1 sudo[80865]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:14 compute-1 sudo[80890]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/etc/ceph/ceph.conf.new
Dec 06 09:41:14 compute-1 sudo[80890]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:41:14 compute-1 sudo[80890]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:14 compute-1 sudo[80938]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/etc/ceph/ceph.conf.new
Dec 06 09:41:14 compute-1 sudo[80938]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:41:14 compute-1 sudo[80938]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:14 compute-1 sudo[80963]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/etc/ceph/ceph.conf.new
Dec 06 09:41:14 compute-1 sudo[80963]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:41:14 compute-1 sudo[80963]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:15 compute-1 sudo[80988]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Dec 06 09:41:15 compute-1 sudo[80988]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:41:15 compute-1 sudo[80988]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:15 compute-1 sudo[81013]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config
Dec 06 09:41:15 compute-1 sudo[81013]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:41:15 compute-1 sudo[81013]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:15 compute-1 sudo[81038]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config
Dec 06 09:41:15 compute-1 sudo[81038]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:41:15 compute-1 sudo[81038]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:15 compute-1 sudo[81063]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.conf.new
Dec 06 09:41:15 compute-1 sudo[81063]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:41:15 compute-1 sudo[81063]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:15 compute-1 ceph-mon[79770]: 2.13 scrub ok
Dec 06 09:41:15 compute-1 ceph-mon[79770]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:41:15 compute-1 ceph-mon[79770]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"}]: dispatch
Dec 06 09:41:15 compute-1 ceph-mon[79770]: Adjusting osd_memory_target on compute-2 to 127.9M
Dec 06 09:41:15 compute-1 ceph-mon[79770]: Unable to set osd_memory_target on compute-2 to 134214860: error parsing value: Value '134214860' is below minimum 939524096
Dec 06 09:41:15 compute-1 ceph-mon[79770]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 09:41:15 compute-1 ceph-mon[79770]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 06 09:41:15 compute-1 ceph-mon[79770]: Updating compute-0:/etc/ceph/ceph.conf
Dec 06 09:41:15 compute-1 ceph-mon[79770]: Updating compute-1:/etc/ceph/ceph.conf
Dec 06 09:41:15 compute-1 ceph-mon[79770]: Updating compute-2:/etc/ceph/ceph.conf
Dec 06 09:41:15 compute-1 ceph-mon[79770]: from='client.24160 -' entity='client.admin' cmd=[{"prefix": "dashboard set-grafana-api-url", "value": "http://192.168.122.100:3100", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 09:41:15 compute-1 ceph-mon[79770]: from='mgr.24116 192.168.122.100:0/4088948354' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:41:15 compute-1 sudo[81088]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258
Dec 06 09:41:15 compute-1 sudo[81088]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:41:15 compute-1 sudo[81088]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:15 compute-1 sudo[81113]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.conf.new
Dec 06 09:41:15 compute-1 sudo[81113]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:41:15 compute-1 sudo[81113]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:15 compute-1 sudo[81161]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.conf.new
Dec 06 09:41:15 compute-1 sudo[81161]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:41:15 compute-1 sudo[81161]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:15 compute-1 sudo[81186]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.conf.new
Dec 06 09:41:15 compute-1 sudo[81186]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:41:15 compute-1 sudo[81186]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:15 compute-1 sudo[81211]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.conf.new /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.conf
Dec 06 09:41:15 compute-1 sudo[81211]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:41:15 compute-1 sudo[81211]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:15 compute-1 sudo[81236]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 06 09:41:15 compute-1 sudo[81236]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:41:15 compute-1 sudo[81236]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:15 compute-1 sudo[81261]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/etc/ceph
Dec 06 09:41:15 compute-1 sudo[81261]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:41:15 compute-1 sudo[81261]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:15 compute-1 sudo[81286]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/etc/ceph/ceph.client.admin.keyring.new
Dec 06 09:41:15 compute-1 sudo[81286]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:41:16 compute-1 sudo[81286]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:16 compute-1 sudo[81311]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258
Dec 06 09:41:16 compute-1 sudo[81311]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:41:16 compute-1 sudo[81311]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:16 compute-1 sudo[81336]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/etc/ceph/ceph.client.admin.keyring.new
Dec 06 09:41:16 compute-1 sudo[81336]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:41:16 compute-1 sudo[81336]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:16 compute-1 sudo[81384]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/etc/ceph/ceph.client.admin.keyring.new
Dec 06 09:41:16 compute-1 sudo[81384]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:41:16 compute-1 sudo[81384]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:16 compute-1 ceph-mon[79770]: 2.d scrub starts
Dec 06 09:41:16 compute-1 ceph-mon[79770]: 2.d scrub ok
Dec 06 09:41:16 compute-1 ceph-mon[79770]: Updating compute-0:/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.conf
Dec 06 09:41:16 compute-1 ceph-mon[79770]: Updating compute-1:/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.conf
Dec 06 09:41:16 compute-1 ceph-mon[79770]: Updating compute-2:/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.conf
Dec 06 09:41:16 compute-1 ceph-mon[79770]: pgmap v8: 131 pgs: 44 peering, 87 active+clean; 449 KiB data, 480 MiB used, 60 GiB / 60 GiB avail; 34 KiB/s rd, 0 B/s wr, 14 op/s
Dec 06 09:41:16 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/986641805' entity='client.admin' cmd=[{"prefix": "mgr module disable", "module": "dashboard"}]: dispatch
Dec 06 09:41:16 compute-1 ceph-mon[79770]: 2.10 scrub starts
Dec 06 09:41:16 compute-1 ceph-mon[79770]: Updating compute-1:/etc/ceph/ceph.client.admin.keyring
Dec 06 09:41:16 compute-1 ceph-mon[79770]: 2.10 scrub ok
Dec 06 09:41:16 compute-1 ceph-mon[79770]: Updating compute-2:/etc/ceph/ceph.client.admin.keyring
Dec 06 09:41:16 compute-1 ceph-mon[79770]: Updating compute-0:/etc/ceph/ceph.client.admin.keyring
Dec 06 09:41:16 compute-1 ceph-mgr[80080]: mgr handle_mgr_map respawning because set of enabled modules changed!
Dec 06 09:41:16 compute-1 ceph-mgr[80080]: mgr respawn  e: '/usr/bin/ceph-mgr'
Dec 06 09:41:16 compute-1 ceph-mgr[80080]: mgr respawn  0: '/usr/bin/ceph-mgr'
Dec 06 09:41:16 compute-1 ceph-mgr[80080]: mgr respawn  1: '-n'
Dec 06 09:41:16 compute-1 ceph-mgr[80080]: mgr respawn  2: 'mgr.compute-1.sauzid'
Dec 06 09:41:16 compute-1 ceph-mgr[80080]: mgr respawn  3: '-f'
Dec 06 09:41:16 compute-1 ceph-mgr[80080]: mgr respawn  4: '--setuser'
Dec 06 09:41:16 compute-1 ceph-mgr[80080]: mgr respawn  5: 'ceph'
Dec 06 09:41:16 compute-1 ceph-mgr[80080]: mgr respawn  6: '--setgroup'
Dec 06 09:41:16 compute-1 ceph-mgr[80080]: mgr respawn  7: 'ceph'
Dec 06 09:41:16 compute-1 ceph-mgr[80080]: mgr respawn  8: '--default-log-to-file=false'
Dec 06 09:41:16 compute-1 ceph-mgr[80080]: mgr respawn  9: '--default-log-to-journald=true'
Dec 06 09:41:16 compute-1 ceph-mgr[80080]: mgr respawn  10: '--default-log-to-stderr=false'
Dec 06 09:41:16 compute-1 ceph-mgr[80080]: mgr respawn respawning with exe /usr/bin/ceph-mgr
Dec 06 09:41:16 compute-1 ceph-mgr[80080]: mgr respawn  exe_path /proc/self/exe
Dec 06 09:41:16 compute-1 sudo[81409]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/etc/ceph/ceph.client.admin.keyring.new
Dec 06 09:41:16 compute-1 sudo[81409]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:41:16 compute-1 sudo[81409]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:16 compute-1 sshd-session[80420]: Read error from remote host 192.168.122.100 port 38444: Connection reset by peer
Dec 06 09:41:16 compute-1 sshd-session[80417]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 06 09:41:16 compute-1 systemd[1]: session-33.scope: Deactivated successfully.
Dec 06 09:41:16 compute-1 systemd[1]: session-33.scope: Consumed 5.190s CPU time.
Dec 06 09:41:16 compute-1 systemd-logind[788]: Session 33 logged out. Waiting for processes to exit.
Dec 06 09:41:16 compute-1 systemd-logind[788]: Removed session 33.
Dec 06 09:41:16 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: ignoring --setuser ceph since I am not root
Dec 06 09:41:16 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: ignoring --setgroup ceph since I am not root
Dec 06 09:41:16 compute-1 ceph-mgr[80080]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Dec 06 09:41:16 compute-1 ceph-mgr[80080]: pidfile_write: ignore empty --pid-file
Dec 06 09:41:16 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'alerts'
Dec 06 09:41:16 compute-1 ceph-mgr[80080]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec 06 09:41:16 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:16.710+0000 7fecc40b7140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec 06 09:41:16 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'balancer'
Dec 06 09:41:16 compute-1 ceph-mgr[80080]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec 06 09:41:16 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:16.789+0000 7fecc40b7140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec 06 09:41:16 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'cephadm'
Dec 06 09:41:17 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/986641805' entity='client.admin' cmd='[{"prefix": "mgr module disable", "module": "dashboard"}]': finished
Dec 06 09:41:17 compute-1 ceph-mon[79770]: mgrmap e19: compute-0.qhdjwa(active, since 7s), standbys: compute-1.sauzid, compute-2.oazbvn
Dec 06 09:41:17 compute-1 ceph-mon[79770]: 3.1d scrub starts
Dec 06 09:41:17 compute-1 ceph-mon[79770]: 3.1d scrub ok
Dec 06 09:41:17 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/2772325777' entity='client.admin' cmd=[{"prefix": "mgr module enable", "module": "dashboard"}]: dispatch
Dec 06 09:41:17 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'crash'
Dec 06 09:41:17 compute-1 ceph-mgr[80080]: mgr[py] Module crash has missing NOTIFY_TYPES member
Dec 06 09:41:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:17.713+0000 7fecc40b7140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Dec 06 09:41:17 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'dashboard'
Dec 06 09:41:17 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e36 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 09:41:18 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'devicehealth'
Dec 06 09:41:18 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/2772325777' entity='client.admin' cmd='[{"prefix": "mgr module enable", "module": "dashboard"}]': finished
Dec 06 09:41:18 compute-1 ceph-mon[79770]: mgrmap e20: compute-0.qhdjwa(active, since 8s), standbys: compute-1.sauzid, compute-2.oazbvn
Dec 06 09:41:18 compute-1 ceph-mon[79770]: 2.15 scrub starts
Dec 06 09:41:18 compute-1 ceph-mon[79770]: 2.15 scrub ok
Dec 06 09:41:18 compute-1 ceph-mgr[80080]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec 06 09:41:18 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'diskprediction_local'
Dec 06 09:41:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:18.413+0000 7fecc40b7140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec 06 09:41:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Dec 06 09:41:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Dec 06 09:41:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]:   from numpy import show_config as show_numpy_config
Dec 06 09:41:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:18.600+0000 7fecc40b7140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec 06 09:41:18 compute-1 ceph-mgr[80080]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec 06 09:41:18 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'influx'
Dec 06 09:41:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:18.676+0000 7fecc40b7140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Dec 06 09:41:18 compute-1 ceph-mgr[80080]: mgr[py] Module influx has missing NOTIFY_TYPES member
Dec 06 09:41:18 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'insights'
Dec 06 09:41:18 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'iostat'
Dec 06 09:41:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:18.866+0000 7fecc40b7140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec 06 09:41:18 compute-1 ceph-mgr[80080]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec 06 09:41:18 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'k8sevents'
Dec 06 09:41:19 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'localpool'
Dec 06 09:41:19 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'mds_autoscaler'
Dec 06 09:41:19 compute-1 ceph-mon[79770]: 3.1a scrub starts
Dec 06 09:41:19 compute-1 ceph-mon[79770]: 3.1a scrub ok
Dec 06 09:41:19 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'mirroring'
Dec 06 09:41:19 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'nfs'
Dec 06 09:41:20 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:20.006+0000 7fecc40b7140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec 06 09:41:20 compute-1 ceph-mgr[80080]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec 06 09:41:20 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'orchestrator'
Dec 06 09:41:20 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:20.248+0000 7fecc40b7140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec 06 09:41:20 compute-1 ceph-mgr[80080]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec 06 09:41:20 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'osd_perf_query'
Dec 06 09:41:20 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:20.334+0000 7fecc40b7140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec 06 09:41:20 compute-1 ceph-mgr[80080]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec 06 09:41:20 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'osd_support'
Dec 06 09:41:20 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:20.404+0000 7fecc40b7140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec 06 09:41:20 compute-1 ceph-mgr[80080]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec 06 09:41:20 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'pg_autoscaler'
Dec 06 09:41:20 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:20.485+0000 7fecc40b7140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec 06 09:41:20 compute-1 ceph-mgr[80080]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec 06 09:41:20 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'progress'
Dec 06 09:41:20 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:20.570+0000 7fecc40b7140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Dec 06 09:41:20 compute-1 ceph-mgr[80080]: mgr[py] Module progress has missing NOTIFY_TYPES member
Dec 06 09:41:20 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'prometheus'
Dec 06 09:41:20 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:20.927+0000 7fecc40b7140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec 06 09:41:20 compute-1 ceph-mgr[80080]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec 06 09:41:20 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'rbd_support'
Dec 06 09:41:21 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:21.031+0000 7fecc40b7140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec 06 09:41:21 compute-1 ceph-mgr[80080]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec 06 09:41:21 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'restful'
Dec 06 09:41:21 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'rgw'
Dec 06 09:41:21 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:21.500+0000 7fecc40b7140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec 06 09:41:21 compute-1 ceph-mgr[80080]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec 06 09:41:21 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'rook'
Dec 06 09:41:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:22.130+0000 7fecc40b7140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Dec 06 09:41:22 compute-1 ceph-mgr[80080]: mgr[py] Module rook has missing NOTIFY_TYPES member
Dec 06 09:41:22 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'selftest'
Dec 06 09:41:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:22.202+0000 7fecc40b7140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec 06 09:41:22 compute-1 ceph-mgr[80080]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec 06 09:41:22 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'snap_schedule'
Dec 06 09:41:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:22.297+0000 7fecc40b7140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Dec 06 09:41:22 compute-1 ceph-mgr[80080]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Dec 06 09:41:22 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'stats'
Dec 06 09:41:22 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'status'
Dec 06 09:41:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:22.469+0000 7fecc40b7140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Dec 06 09:41:22 compute-1 ceph-mgr[80080]: mgr[py] Module status has missing NOTIFY_TYPES member
Dec 06 09:41:22 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'telegraf'
Dec 06 09:41:22 compute-1 ceph-mon[79770]: Standby manager daemon compute-2.oazbvn restarted
Dec 06 09:41:22 compute-1 ceph-mon[79770]: Standby manager daemon compute-2.oazbvn started
Dec 06 09:41:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:22.546+0000 7fecc40b7140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec 06 09:41:22 compute-1 ceph-mgr[80080]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec 06 09:41:22 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'telemetry'
Dec 06 09:41:22 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e37 e37: 3 total, 3 up, 3 in
Dec 06 09:41:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:22.703+0000 7fecc40b7140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec 06 09:41:22 compute-1 ceph-mgr[80080]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec 06 09:41:22 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'test_orchestrator'
Dec 06 09:41:22 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e37 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 09:41:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:22.949+0000 7fecc40b7140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec 06 09:41:22 compute-1 ceph-mgr[80080]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec 06 09:41:22 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'volumes'
Dec 06 09:41:23 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:23.241+0000 7fecc40b7140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec 06 09:41:23 compute-1 ceph-mgr[80080]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec 06 09:41:23 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'zabbix'
Dec 06 09:41:23 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:23.314+0000 7fecc40b7140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Dec 06 09:41:23 compute-1 ceph-mgr[80080]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
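
The repeated "-1 mgr[py] Module <name> has missing NOTIFY_TYPES member" lines are emitted once per Python module as the mgr loads it; in this Squid build they appear to be informational rather than load failures, since each one is immediately followed by the next module loading normally. A hedged way to confirm module state after startup:

    # list always-on, enabled, and disabled mgr modules
    ceph mgr module ls
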
Dec 06 09:41:23 compute-1 ceph-mgr[80080]: ms_deliver_dispatch: unhandled message 0x55b29e54d860 mon_map magic: 0 from mon.2 v2:192.168.122.101:3300/0
Dec 06 09:41:23 compute-1 ceph-mgr[80080]: mgr handle_mgr_map respawning because set of enabled modules changed!
Dec 06 09:41:23 compute-1 ceph-mgr[80080]: mgr respawn  e: '/usr/bin/ceph-mgr'
Dec 06 09:41:23 compute-1 ceph-mgr[80080]: mgr respawn  0: '/usr/bin/ceph-mgr'
Dec 06 09:41:23 compute-1 ceph-mgr[80080]: mgr respawn  1: '-n'
Dec 06 09:41:23 compute-1 ceph-mgr[80080]: mgr respawn  2: 'mgr.compute-1.sauzid'
Dec 06 09:41:23 compute-1 ceph-mgr[80080]: mgr respawn  3: '-f'
Dec 06 09:41:23 compute-1 ceph-mgr[80080]: mgr respawn  4: '--setuser'
Dec 06 09:41:23 compute-1 ceph-mgr[80080]: mgr respawn  5: 'ceph'
Dec 06 09:41:23 compute-1 ceph-mgr[80080]: mgr respawn  6: '--setgroup'
Dec 06 09:41:23 compute-1 ceph-mgr[80080]: mgr respawn  7: 'ceph'
Dec 06 09:41:23 compute-1 ceph-mgr[80080]: mgr respawn  8: '--default-log-to-file=false'
Dec 06 09:41:23 compute-1 ceph-mgr[80080]: mgr respawn  9: '--default-log-to-journald=true'
Dec 06 09:41:23 compute-1 ceph-mgr[80080]: mgr respawn  10: '--default-log-to-stderr=false'
Dec 06 09:41:23 compute-1 ceph-mgr[80080]: mgr respawn respawning with exe /usr/bin/ceph-mgr
Dec 06 09:41:23 compute-1 ceph-mgr[80080]: mgr respawn  exe_path /proc/self/exe
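
The block above shows why the freshly started mgr restarts again: handle_mgr_map noticed that the set of enabled modules in the new mgrmap (the dashboard was just toggled) differs from what this process started with, so it re-execs itself via /proc/self/exe, echoing its argv first. Reassembled from the echoed arguments, the respawned process is equivalent to this foreground invocation:

    # argv 0..10 exactly as echoed in the 'mgr respawn' lines above
    /usr/bin/ceph-mgr -n mgr.compute-1.sauzid -f \
        --setuser ceph --setgroup ceph \
        --default-log-to-file=false \
        --default-log-to-journald=true \
        --default-log-to-stderr=false
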
Dec 06 09:41:23 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: ignoring --setuser ceph since I am not root
Dec 06 09:41:23 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: ignoring --setgroup ceph since I am not root
Dec 06 09:41:23 compute-1 ceph-mgr[80080]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Dec 06 09:41:23 compute-1 ceph-mgr[80080]: pidfile_write: ignore empty --pid-file
Dec 06 09:41:23 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'alerts'
Dec 06 09:41:23 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:23.548+0000 7f3d65c6f140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec 06 09:41:23 compute-1 ceph-mgr[80080]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec 06 09:41:23 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'balancer'
Dec 06 09:41:23 compute-1 ceph-mon[79770]: mgrmap e21: compute-0.qhdjwa(active, since 13s), standbys: compute-1.sauzid, compute-2.oazbvn
Dec 06 09:41:23 compute-1 ceph-mon[79770]: Active manager daemon compute-0.qhdjwa restarted
Dec 06 09:41:23 compute-1 ceph-mon[79770]: Activating manager daemon compute-0.qhdjwa
Dec 06 09:41:23 compute-1 ceph-mon[79770]: osdmap e37: 3 total, 3 up, 3 in
Dec 06 09:41:23 compute-1 ceph-mon[79770]: mgrmap e22: compute-0.qhdjwa(active, starting, since 0.0444664s), standbys: compute-1.sauzid, compute-2.oazbvn
Dec 06 09:41:23 compute-1 ceph-mon[79770]: Standby manager daemon compute-1.sauzid restarted
Dec 06 09:41:23 compute-1 ceph-mon[79770]: Standby manager daemon compute-1.sauzid started
Dec 06 09:41:23 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:23.633+0000 7f3d65c6f140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec 06 09:41:23 compute-1 ceph-mgr[80080]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec 06 09:41:23 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'cephadm'
Dec 06 09:41:24 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'crash'
Dec 06 09:41:24 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:24.591+0000 7f3d65c6f140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Dec 06 09:41:24 compute-1 ceph-mgr[80080]: mgr[py] Module crash has missing NOTIFY_TYPES member
Dec 06 09:41:24 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'dashboard'
Dec 06 09:41:24 compute-1 ceph-mon[79770]: mgrmap e23: compute-0.qhdjwa(active, starting, since 1.05925s), standbys: compute-1.sauzid, compute-2.oazbvn
Dec 06 09:41:25 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'devicehealth'
Dec 06 09:41:25 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:25.307+0000 7f3d65c6f140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec 06 09:41:25 compute-1 ceph-mgr[80080]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec 06 09:41:25 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'diskprediction_local'
Dec 06 09:41:25 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Dec 06 09:41:25 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Dec 06 09:41:25 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]:   from numpy import show_config as show_numpy_config
Dec 06 09:41:25 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:25.501+0000 7f3d65c6f140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec 06 09:41:25 compute-1 ceph-mgr[80080]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec 06 09:41:25 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'influx'
Dec 06 09:41:25 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:25.582+0000 7f3d65c6f140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Dec 06 09:41:25 compute-1 ceph-mgr[80080]: mgr[py] Module influx has missing NOTIFY_TYPES member
Dec 06 09:41:25 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'insights'
Dec 06 09:41:25 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'iostat'
Dec 06 09:41:25 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:25.736+0000 7f3d65c6f140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec 06 09:41:25 compute-1 ceph-mgr[80080]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec 06 09:41:25 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'k8sevents'
Dec 06 09:41:26 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'localpool'
Dec 06 09:41:26 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'mds_autoscaler'
Dec 06 09:41:26 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'mirroring'
Dec 06 09:41:26 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'nfs'
Dec 06 09:41:26 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:26.873+0000 7f3d65c6f140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec 06 09:41:26 compute-1 ceph-mgr[80080]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec 06 09:41:26 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'orchestrator'
Dec 06 09:41:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:27.114+0000 7f3d65c6f140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec 06 09:41:27 compute-1 ceph-mgr[80080]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec 06 09:41:27 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'osd_perf_query'
Dec 06 09:41:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:27.200+0000 7f3d65c6f140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec 06 09:41:27 compute-1 ceph-mgr[80080]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec 06 09:41:27 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'osd_support'
Dec 06 09:41:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:27.275+0000 7f3d65c6f140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec 06 09:41:27 compute-1 ceph-mgr[80080]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec 06 09:41:27 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'pg_autoscaler'
Dec 06 09:41:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:27.371+0000 7f3d65c6f140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec 06 09:41:27 compute-1 ceph-mgr[80080]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec 06 09:41:27 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'progress'
Dec 06 09:41:27 compute-1 systemd[1]: Stopping User Manager for UID 42477...
Dec 06 09:41:27 compute-1 systemd[72560]: Activating special unit Exit the Session...
Dec 06 09:41:27 compute-1 systemd[72560]: Stopped target Main User Target.
Dec 06 09:41:27 compute-1 systemd[72560]: Stopped target Basic System.
Dec 06 09:41:27 compute-1 systemd[72560]: Stopped target Paths.
Dec 06 09:41:27 compute-1 systemd[72560]: Stopped target Sockets.
Dec 06 09:41:27 compute-1 systemd[72560]: Stopped target Timers.
Dec 06 09:41:27 compute-1 systemd[72560]: Stopped Mark boot as successful after the user session has run 2 minutes.
Dec 06 09:41:27 compute-1 systemd[72560]: Stopped Daily Cleanup of User's Temporary Directories.
Dec 06 09:41:27 compute-1 systemd[72560]: Closed D-Bus User Message Bus Socket.
Dec 06 09:41:27 compute-1 systemd[72560]: Stopped Create User's Volatile Files and Directories.
Dec 06 09:41:27 compute-1 systemd[72560]: Removed slice User Application Slice.
Dec 06 09:41:27 compute-1 systemd[72560]: Reached target Shutdown.
Dec 06 09:41:27 compute-1 systemd[72560]: Finished Exit the Session.
Dec 06 09:41:27 compute-1 systemd[72560]: Reached target Exit the Session.
Dec 06 09:41:27 compute-1 systemd[1]: user@42477.service: Deactivated successfully.
Dec 06 09:41:27 compute-1 systemd[1]: Stopped User Manager for UID 42477.
Dec 06 09:41:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:27.455+0000 7f3d65c6f140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Dec 06 09:41:27 compute-1 ceph-mgr[80080]: mgr[py] Module progress has missing NOTIFY_TYPES member
Dec 06 09:41:27 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'prometheus'
Dec 06 09:41:27 compute-1 systemd[1]: Stopping User Runtime Directory /run/user/42477...
Dec 06 09:41:27 compute-1 systemd[1]: run-user-42477.mount: Deactivated successfully.
Dec 06 09:41:27 compute-1 systemd[1]: user-runtime-dir@42477.service: Deactivated successfully.
Dec 06 09:41:27 compute-1 systemd[1]: Stopped User Runtime Directory /run/user/42477.
Dec 06 09:41:27 compute-1 systemd[1]: Removed slice User Slice of UID 42477.
Dec 06 09:41:27 compute-1 systemd[1]: user-42477.slice: Consumed 1min 32.940s CPU time.
Dec 06 09:41:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:27.837+0000 7f3d65c6f140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec 06 09:41:27 compute-1 ceph-mgr[80080]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec 06 09:41:27 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'rbd_support'
Dec 06 09:41:27 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e37 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 09:41:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:27.947+0000 7f3d65c6f140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec 06 09:41:27 compute-1 ceph-mgr[80080]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec 06 09:41:27 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'restful'
Dec 06 09:41:28 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'rgw'
Dec 06 09:41:28 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:28.417+0000 7f3d65c6f140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec 06 09:41:28 compute-1 ceph-mgr[80080]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec 06 09:41:28 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'rook'
Dec 06 09:41:28 compute-1 ceph-mon[79770]: Standby manager daemon compute-2.oazbvn restarted
Dec 06 09:41:28 compute-1 ceph-mon[79770]: Standby manager daemon compute-2.oazbvn started
Dec 06 09:41:28 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e38 e38: 3 total, 3 up, 3 in
Dec 06 09:41:29 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:29.087+0000 7f3d65c6f140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Dec 06 09:41:29 compute-1 ceph-mgr[80080]: mgr[py] Module rook has missing NOTIFY_TYPES member
Dec 06 09:41:29 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'selftest'
Dec 06 09:41:29 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:29.166+0000 7f3d65c6f140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec 06 09:41:29 compute-1 ceph-mgr[80080]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec 06 09:41:29 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'snap_schedule'
Dec 06 09:41:29 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:29.254+0000 7f3d65c6f140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Dec 06 09:41:29 compute-1 ceph-mgr[80080]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Dec 06 09:41:29 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'stats'
Dec 06 09:41:29 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'status'
Dec 06 09:41:29 compute-1 sshd-session[81500]: Accepted publickey for ceph-admin from 192.168.122.100 port 59718 ssh2: RSA SHA256:Gxeh0g0CuyN5zOpDUv+8o0JynyC1ASnaMny1857KGxo
Dec 06 09:41:29 compute-1 systemd[1]: Created slice User Slice of UID 42477.
Dec 06 09:41:29 compute-1 systemd[1]: Starting User Runtime Directory /run/user/42477...
Dec 06 09:41:29 compute-1 systemd-logind[788]: New session 34 of user ceph-admin.
Dec 06 09:41:29 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:29.398+0000 7f3d65c6f140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Dec 06 09:41:29 compute-1 ceph-mgr[80080]: mgr[py] Module status has missing NOTIFY_TYPES member
Dec 06 09:41:29 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'telegraf'
Dec 06 09:41:29 compute-1 systemd[1]: Finished User Runtime Directory /run/user/42477.
Dec 06 09:41:29 compute-1 systemd[1]: Starting User Manager for UID 42477...
Dec 06 09:41:29 compute-1 systemd[81504]: pam_unix(systemd-user:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Dec 06 09:41:29 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:29.471+0000 7f3d65c6f140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec 06 09:41:29 compute-1 ceph-mgr[80080]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec 06 09:41:29 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'telemetry'
Dec 06 09:41:29 compute-1 systemd[81504]: Queued start job for default target Main User Target.
Dec 06 09:41:29 compute-1 systemd[81504]: Created slice User Application Slice.
Dec 06 09:41:29 compute-1 systemd[81504]: Started Mark boot as successful after the user session has run 2 minutes.
Dec 06 09:41:29 compute-1 systemd[81504]: Started Daily Cleanup of User's Temporary Directories.
Dec 06 09:41:29 compute-1 systemd[81504]: Reached target Paths.
Dec 06 09:41:29 compute-1 systemd[81504]: Reached target Timers.
Dec 06 09:41:29 compute-1 systemd[81504]: Starting D-Bus User Message Bus Socket...
Dec 06 09:41:29 compute-1 systemd[81504]: Starting Create User's Volatile Files and Directories...
Dec 06 09:41:29 compute-1 systemd[81504]: Listening on D-Bus User Message Bus Socket.
Dec 06 09:41:29 compute-1 systemd[81504]: Finished Create User's Volatile Files and Directories.
Dec 06 09:41:29 compute-1 systemd[81504]: Reached target Sockets.
Dec 06 09:41:29 compute-1 systemd[81504]: Reached target Basic System.
Dec 06 09:41:29 compute-1 systemd[81504]: Reached target Main User Target.
Dec 06 09:41:29 compute-1 systemd[81504]: Startup finished in 133ms.
Dec 06 09:41:29 compute-1 systemd[1]: Started User Manager for UID 42477.
Dec 06 09:41:29 compute-1 systemd[1]: Started Session 34 of User ceph-admin.
Dec 06 09:41:29 compute-1 sshd-session[81500]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Dec 06 09:41:29 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:29.627+0000 7f3d65c6f140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec 06 09:41:29 compute-1 ceph-mgr[80080]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec 06 09:41:29 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'test_orchestrator'
Dec 06 09:41:29 compute-1 sudo[81520]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:41:29 compute-1 sudo[81520]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:41:29 compute-1 sudo[81520]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:29 compute-1 sudo[81545]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Dec 06 09:41:29 compute-1 sudo[81545]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
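
This sudo sequence is cephadm's remote-management pattern: the orchestrator logs in over SSH as ceph-admin, resolves python3, then runs the cephadm binary it copied under /var/lib/ceph/<fsid>/ as root; 'ls' inventories the daemons on this host, and the gather-facts and list-networks calls below follow the same pattern. A hedged manual equivalent on a host with the cephadm CLI installed:

    # same host inventory the orchestrator just collected
    sudo cephadm ls
    sudo cephadm gather-facts
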
Dec 06 09:41:29 compute-1 ceph-mon[79770]: mgrmap e24: compute-0.qhdjwa(active, starting, since 6s), standbys: compute-1.sauzid, compute-2.oazbvn
Dec 06 09:41:29 compute-1 ceph-mon[79770]: Active manager daemon compute-0.qhdjwa restarted
Dec 06 09:41:29 compute-1 ceph-mon[79770]: Activating manager daemon compute-0.qhdjwa
Dec 06 09:41:29 compute-1 ceph-mon[79770]: osdmap e38: 3 total, 3 up, 3 in
Dec 06 09:41:29 compute-1 ceph-mon[79770]: mgrmap e25: compute-0.qhdjwa(active, starting, since 0.0265623s), standbys: compute-1.sauzid, compute-2.oazbvn
Dec 06 09:41:29 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "mon metadata", "id": "compute-0"}]: dispatch
Dec 06 09:41:29 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Dec 06 09:41:29 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Dec 06 09:41:29 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "mgr metadata", "who": "compute-0.qhdjwa", "id": "compute-0.qhdjwa"}]: dispatch
Dec 06 09:41:29 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "mgr metadata", "who": "compute-1.sauzid", "id": "compute-1.sauzid"}]: dispatch
Dec 06 09:41:29 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "mgr metadata", "who": "compute-2.oazbvn", "id": "compute-2.oazbvn"}]: dispatch
Dec 06 09:41:29 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Dec 06 09:41:29 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Dec 06 09:41:29 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec 06 09:41:29 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "mds metadata"}]: dispatch
Dec 06 09:41:29 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd metadata"}]: dispatch
Dec 06 09:41:29 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "mon metadata"}]: dispatch
Dec 06 09:41:29 compute-1 ceph-mon[79770]: Manager daemon compute-0.qhdjwa is now available
Dec 06 09:41:29 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.qhdjwa/mirror_snapshot_schedule"}]: dispatch
Dec 06 09:41:29 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.qhdjwa/trash_purge_schedule"}]: dispatch
Dec 06 09:41:29 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:29.856+0000 7f3d65c6f140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec 06 09:41:29 compute-1 ceph-mgr[80080]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec 06 09:41:29 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'volumes'
Dec 06 09:41:29 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).mds e2 new map
Dec 06 09:41:29 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).mds e2 print_map
                                           e2
                                           btime 2025-12-06T09:41:29.967825+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        2
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2025-12-06T09:41:29.967778+0000
                                           modified        2025-12-06T09:41:29.967778+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           max_mds        1
                                           in        
                                           up        {}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        0
                                           qdb_cluster        leader: 0 members: 
                                            
                                            
Dec 06 09:41:29 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e39 e39: 3 total, 3 up, 3 in
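
The print_map dump above is the first FSMap epoch for the new 'cephfs' filesystem: max_mds is 1, but "in" and "up {}" are empty, meaning no MDS daemon has claimed rank 0 yet, which is exactly why the MDS_ALL_DOWN and MDS_UP_LESS_THAN_MAX health checks fire moments later. Hedged commands to inspect the same state:

    # human-readable summary, and the raw FSMap as printed above
    ceph fs status cephfs
    ceph fs dump
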
Dec 06 09:41:30 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:30.157+0000 7f3d65c6f140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec 06 09:41:30 compute-1 ceph-mgr[80080]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec 06 09:41:30 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'zabbix'
Dec 06 09:41:30 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:41:30.242+0000 7f3d65c6f140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Dec 06 09:41:30 compute-1 ceph-mgr[80080]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Dec 06 09:41:30 compute-1 ceph-mgr[80080]: [dashboard DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 06 09:41:30 compute-1 ceph-mgr[80080]: mgr load Constructed class from module: dashboard
Dec 06 09:41:30 compute-1 ceph-mgr[80080]: [dashboard INFO root] server: ssl=no host=192.168.122.101 port=8443
Dec 06 09:41:30 compute-1 ceph-mgr[80080]: [dashboard INFO root] Configured CherryPy, starting engine...
Dec 06 09:41:30 compute-1 ceph-mgr[80080]: [dashboard INFO root] Starting engine...
Dec 06 09:41:30 compute-1 ceph-mgr[80080]: ms_deliver_dispatch: unhandled message 0x56550deb1860 mon_map magic: 0 from mon.2 v2:192.168.122.101:3300/0
Dec 06 09:41:30 compute-1 ceph-mgr[80080]: [dashboard INFO root] Engine started...
Dec 06 09:41:30 compute-1 podman[81652]: 2025-12-06 09:41:30.441162829 +0000 UTC m=+0.098113352 container exec 500f8c89b5c281d0ace139835b07ea5bc6259ce3e028d8284d57da97424b55e0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-crash-compute-1, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid)
Dec 06 09:41:30 compute-1 podman[81652]: 2025-12-06 09:41:30.533959907 +0000 UTC m=+0.190910390 container exec_died 500f8c89b5c281d0ace139835b07ea5bc6259ce3e028d8284d57da97424b55e0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-crash-compute-1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2)
Dec 06 09:41:30 compute-1 sudo[81545]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:30 compute-1 ceph-mon[79770]: mgrmap e26: compute-0.qhdjwa(active, since 1.06111s), standbys: compute-1.sauzid, compute-2.oazbvn
Dec 06 09:41:30 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta"}]: dispatch
Dec 06 09:41:30 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"bulk": true, "prefix": "osd pool create", "pool": "cephfs.cephfs.data"}]: dispatch
Dec 06 09:41:30 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]: dispatch
Dec 06 09:41:30 compute-1 ceph-mon[79770]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN)
Dec 06 09:41:30 compute-1 ceph-mon[79770]: Health check failed: 1 filesystem is online with fewer MDS than max_mds (MDS_UP_LESS_THAN_MAX)
Dec 06 09:41:30 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]': finished
Dec 06 09:41:30 compute-1 ceph-mon[79770]: osdmap e39: 3 total, 3 up, 3 in
Dec 06 09:41:30 compute-1 ceph-mon[79770]: fsmap cephfs:0
Dec 06 09:41:30 compute-1 ceph-mon[79770]: Saving service mds.cephfs spec with placement compute-0;compute-1;compute-2
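
The "fs new ... finished" entry plus the saved mds.cephfs service spec is the standard orchestrator bring-up: the mgr creates the metadata and data pools, creates the filesystem, then schedules one MDS per placement host; MDS_ALL_DOWN clears once a deployed daemon claims rank 0. A hedged CLI equivalent of the spec just persisted, with the placement copied from the log:

    # schedule MDS daemons for the new filesystem
    ceph orch apply mds cephfs --placement="compute-0;compute-1;compute-2"
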
Dec 06 09:41:30 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:41:30 compute-1 ceph-mon[79770]: [06/Dec/2025:09:41:30] ENGINE Bus STARTING
Dec 06 09:41:30 compute-1 ceph-mon[79770]: [06/Dec/2025:09:41:30] ENGINE Serving on https://192.168.122.100:7150
Dec 06 09:41:30 compute-1 ceph-mon[79770]: [06/Dec/2025:09:41:30] ENGINE Client ('192.168.122.100', 59318) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Dec 06 09:41:30 compute-1 ceph-mon[79770]: Standby manager daemon compute-1.sauzid restarted
Dec 06 09:41:30 compute-1 ceph-mon[79770]: Standby manager daemon compute-1.sauzid started
Dec 06 09:41:30 compute-1 ceph-mon[79770]: [06/Dec/2025:09:41:30] ENGINE Serving on http://192.168.122.100:8765
Dec 06 09:41:30 compute-1 ceph-mon[79770]: [06/Dec/2025:09:41:30] ENGINE Bus STARTED
Dec 06 09:41:30 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:41:30 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:41:30 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:41:30 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:41:31 compute-1 sudo[81740]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:41:31 compute-1 sudo[81740]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:41:31 compute-1 sudo[81740]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:31 compute-1 sudo[81765]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 06 09:41:31 compute-1 sudo[81765]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:41:31 compute-1 sudo[81765]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:31 compute-1 sudo[81821]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:41:31 compute-1 sudo[81821]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:41:31 compute-1 sudo[81821]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:31 compute-1 ceph-mon[79770]: pgmap v5: 131 pgs: 131 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 06 09:41:31 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:41:31 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:41:31 compute-1 ceph-mon[79770]: from='client.14424 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 09:41:31 compute-1 ceph-mon[79770]: Saving service mds.cephfs spec with placement compute-0;compute-1;compute-2
Dec 06 09:41:31 compute-1 ceph-mon[79770]: mgrmap e27: compute-0.qhdjwa(active, since 2s), standbys: compute-1.sauzid, compute-2.oazbvn
Dec 06 09:41:31 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:41:31 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:41:31 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:41:31 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"}]: dispatch
Dec 06 09:41:32 compute-1 sudo[81846]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 list-networks
Dec 06 09:41:32 compute-1 sudo[81846]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:41:32 compute-1 sudo[81846]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:32 compute-1 sudo[81889]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 06 09:41:32 compute-1 sudo[81889]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:41:32 compute-1 sudo[81889]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:32 compute-1 sudo[81914]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/etc/ceph
Dec 06 09:41:32 compute-1 sudo[81914]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:41:32 compute-1 sudo[81914]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:32 compute-1 sudo[81939]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/etc/ceph/ceph.conf.new
Dec 06 09:41:32 compute-1 sudo[81939]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:41:32 compute-1 sudo[81939]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:32 compute-1 sudo[81964]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258
Dec 06 09:41:32 compute-1 sudo[81964]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:41:32 compute-1 sudo[81964]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:32 compute-1 sudo[81989]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/etc/ceph/ceph.conf.new
Dec 06 09:41:32 compute-1 sudo[81989]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:41:32 compute-1 sudo[81989]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:32 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e39 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 09:41:32 compute-1 ceph-mon[79770]: Adjusting osd_memory_target on compute-2 to 127.9M
Dec 06 09:41:32 compute-1 ceph-mon[79770]: Unable to set osd_memory_target on compute-2 to 134214860: error parsing value: Value '134214860' is below minimum 939524096
Dec 06 09:41:32 compute-1 ceph-mon[79770]: from='client.14433 -' entity='client.admin' cmd=[{"prefix": "nfs cluster create", "cluster_id": "cephfs", "ingress": true, "virtual_ip": "192.168.122.2/24", "ingress_mode": "haproxy-protocol", "placement": "compute-0 compute-1 compute-2 ", "target": ["mon-mgr", ""]}]: dispatch
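
The audit entry above is the nfs mgr module call behind an HA NFS deployment: an NFS-Ganesha cluster named 'cephfs' fronted by an ingress service (haproxy plus keepalived) on virtual IP 192.168.122.2/24, in haproxy-protocol mode so client addresses survive the proxy hop. A hedged CLI form, with values copied from the audit record:

    # NFS cluster with HA ingress on a virtual IP
    ceph nfs cluster create cephfs "compute-0 compute-1 compute-2" \
        --ingress --virtual-ip 192.168.122.2/24 --ingress-mode haproxy-protocol
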
Dec 06 09:41:32 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool create", "pool": ".nfs", "yes_i_really_mean_it": true}]: dispatch
Dec 06 09:41:32 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:41:32 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:41:32 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"}]: dispatch
Dec 06 09:41:32 compute-1 ceph-mon[79770]: Adjusting osd_memory_target on compute-0 to 128.0M
Dec 06 09:41:32 compute-1 ceph-mon[79770]: Unable to set osd_memory_target on compute-0 to 134217728: error parsing value: Value '134217728' is below minimum 939524096
Dec 06 09:41:32 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:41:32 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:41:32 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"}]: dispatch
Dec 06 09:41:32 compute-1 ceph-mon[79770]: Adjusting osd_memory_target on compute-1 to 127.9M
Dec 06 09:41:32 compute-1 ceph-mon[79770]: Unable to set osd_memory_target on compute-1 to 134211993: error parsing value: Value '134211993' is below minimum 939524096
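
The three Adjusting/Unable pairs are cephadm's osd_memory_target autotuner at work: it splits each host's RAM across the daemons it manages, lands on roughly 128 MiB per OSD on these small VMs, and the value is rejected because osd_memory_target has a hard minimum of 939524096 bytes (896 MiB). The "config rm ... osd_memory_target" dispatches above are cephadm withdrawing its own failed overrides. Hedged ways to quiet this on undersized hosts:

    # stop cephadm from autotuning OSD memory on all hosts
    ceph config set osd osd_memory_target_autotune false
    # or shrink the fraction of host RAM the autotuner hands to OSDs
    ceph config set mgr mgr/cephadm/autotune_memory_target_ratio 0.2
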
Dec 06 09:41:32 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 09:41:32 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
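
These two dispatches are how the mgr assembles the files it is about to push to every managed host (the "Updating <host>:/etc/ceph/ceph.conf" and keyring lines that follow): a minimal config holding just the fsid and mon addresses, plus the client.admin keyring. The same content can be produced by hand:

    # print the minimal ceph.conf that cephadm distributes
    ceph config generate-minimal-conf
    # print the admin credential that lands in /etc/ceph/ceph.client.admin.keyring
    ceph auth get client.admin
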
Dec 06 09:41:32 compute-1 ceph-mon[79770]: Updating compute-0:/etc/ceph/ceph.conf
Dec 06 09:41:32 compute-1 ceph-mon[79770]: Updating compute-1:/etc/ceph/ceph.conf
Dec 06 09:41:32 compute-1 ceph-mon[79770]: Updating compute-2:/etc/ceph/ceph.conf
Dec 06 09:41:33 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e40 e40: 3 total, 3 up, 3 in
Dec 06 09:41:33 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 40 pg[8.0( empty local-lis/les=0/0 n=0 ec=40/40 lis/c=0/0 les/c/f=0/0/0 sis=40) [0] r=0 lpr=40 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:41:33 compute-1 sudo[82037]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/etc/ceph/ceph.conf.new
Dec 06 09:41:33 compute-1 sudo[82037]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:41:33 compute-1 sudo[82037]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:33 compute-1 sudo[82062]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/etc/ceph/ceph.conf.new
Dec 06 09:41:33 compute-1 sudo[82062]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:41:33 compute-1 sudo[82062]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:33 compute-1 sudo[82087]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Dec 06 09:41:33 compute-1 sudo[82087]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:41:33 compute-1 sudo[82087]: pam_unix(sudo:session): session closed for user root
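
The touch/chown/chmod/mv choreography above (and the keyring variant below, which ends with mode 600 instead of 644) is cephadm's atomic file distribution: content is staged under /tmp/cephadm-<fsid> with its final ownership and permissions already applied, then renamed over the destination so readers never observe a half-written /etc/ceph/ceph.conf. A minimal sketch of the same pattern, with FSID standing in for the cluster uuid used above:

    # run as root: stage with final owner/mode, then rename atomically
    mkdir -p /tmp/cephadm-FSID/etc/ceph
    touch /tmp/cephadm-FSID/etc/ceph/ceph.conf.new        # content is then written here
    chown 0:0 /tmp/cephadm-FSID/etc/ceph/ceph.conf.new
    chmod 644 /tmp/cephadm-FSID/etc/ceph/ceph.conf.new
    mv /tmp/cephadm-FSID/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf   # atomic on the same filesystem
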
Dec 06 09:41:33 compute-1 sudo[82112]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config
Dec 06 09:41:33 compute-1 sudo[82112]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:41:33 compute-1 sudo[82112]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:33 compute-1 sudo[82137]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config
Dec 06 09:41:33 compute-1 sudo[82137]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:41:33 compute-1 sudo[82137]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:33 compute-1 sudo[82162]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.conf.new
Dec 06 09:41:33 compute-1 sudo[82162]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:41:33 compute-1 sudo[82162]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:33 compute-1 sudo[82187]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258
Dec 06 09:41:33 compute-1 sudo[82187]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:41:33 compute-1 sudo[82187]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:33 compute-1 sudo[82212]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.conf.new
Dec 06 09:41:33 compute-1 sudo[82212]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:41:33 compute-1 sudo[82212]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:33 compute-1 sudo[82260]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.conf.new
Dec 06 09:41:33 compute-1 sudo[82260]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:41:33 compute-1 sudo[82260]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:33 compute-1 sudo[82285]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.conf.new
Dec 06 09:41:33 compute-1 sudo[82285]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:41:33 compute-1 sudo[82285]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:34 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e41 e41: 3 total, 3 up, 3 in
Dec 06 09:41:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 41 pg[8.0( empty local-lis/les=40/41 n=0 ec=40/40 lis/c=0/0 les/c/f=0/0/0 sis=40) [0] r=0 lpr=40 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:41:34 compute-1 ceph-mon[79770]: pgmap v6: 131 pgs: 131 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 06 09:41:34 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool create", "pool": ".nfs", "yes_i_really_mean_it": true}]': finished
Dec 06 09:41:34 compute-1 ceph-mon[79770]: osdmap e40: 3 total, 3 up, 3 in
Dec 06 09:41:34 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool application enable", "pool": ".nfs", "app": "nfs"}]: dispatch
Dec 06 09:41:34 compute-1 ceph-mon[79770]: Updating compute-2:/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.conf
Dec 06 09:41:34 compute-1 ceph-mon[79770]: Updating compute-0:/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.conf
Dec 06 09:41:34 compute-1 ceph-mon[79770]: Updating compute-1:/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.conf
Dec 06 09:41:34 compute-1 ceph-mon[79770]: mgrmap e28: compute-0.qhdjwa(active, since 4s), standbys: compute-1.sauzid, compute-2.oazbvn
Dec 06 09:41:34 compute-1 ceph-mon[79770]: Updating compute-2:/etc/ceph/ceph.client.admin.keyring
Dec 06 09:41:34 compute-1 ceph-mon[79770]: Updating compute-0:/etc/ceph/ceph.client.admin.keyring
Dec 06 09:41:34 compute-1 sudo[82310]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.conf.new /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.conf
Dec 06 09:41:34 compute-1 sudo[82310]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:41:34 compute-1 sudo[82310]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:34 compute-1 sudo[82335]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 06 09:41:34 compute-1 sudo[82335]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:41:34 compute-1 sudo[82335]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:34 compute-1 sudo[82360]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/etc/ceph
Dec 06 09:41:34 compute-1 sudo[82360]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:41:34 compute-1 sudo[82360]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:34 compute-1 sudo[82385]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/etc/ceph/ceph.client.admin.keyring.new
Dec 06 09:41:34 compute-1 sudo[82385]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:41:34 compute-1 sudo[82385]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:34 compute-1 sudo[82410]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258
Dec 06 09:41:34 compute-1 sudo[82410]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:41:34 compute-1 sudo[82410]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:34 compute-1 sudo[82435]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/etc/ceph/ceph.client.admin.keyring.new
Dec 06 09:41:34 compute-1 sudo[82435]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:41:34 compute-1 sudo[82435]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:34 compute-1 sudo[82483]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/etc/ceph/ceph.client.admin.keyring.new
Dec 06 09:41:34 compute-1 sudo[82483]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:41:34 compute-1 sudo[82483]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:34 compute-1 sudo[82508]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/etc/ceph/ceph.client.admin.keyring.new
Dec 06 09:41:34 compute-1 sudo[82508]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:41:34 compute-1 sudo[82508]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:34 compute-1 sudo[82533]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Dec 06 09:41:34 compute-1 sudo[82533]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:41:34 compute-1 sudo[82533]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:34 compute-1 sudo[82558]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config
Dec 06 09:41:34 compute-1 sudo[82558]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:41:34 compute-1 sudo[82558]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:34 compute-1 sudo[82583]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config
Dec 06 09:41:34 compute-1 sudo[82583]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:41:34 compute-1 sudo[82583]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:34 compute-1 sudo[82608]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.client.admin.keyring.new
Dec 06 09:41:34 compute-1 sudo[82608]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:41:34 compute-1 sudo[82608]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:34 compute-1 sudo[82633]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258
Dec 06 09:41:34 compute-1 sudo[82633]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:41:34 compute-1 sudo[82633]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:35 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool application enable", "pool": ".nfs", "app": "nfs"}]': finished
Dec 06 09:41:35 compute-1 ceph-mon[79770]: osdmap e41: 3 total, 3 up, 3 in
Dec 06 09:41:35 compute-1 ceph-mon[79770]: Updating compute-1:/etc/ceph/ceph.client.admin.keyring
Dec 06 09:41:35 compute-1 ceph-mon[79770]: Saving service nfs.cephfs spec with placement compute-0;compute-1;compute-2
Dec 06 09:41:35 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:41:35 compute-1 ceph-mon[79770]: Saving service ingress.nfs.cephfs spec with placement compute-0;compute-1;compute-2
Dec 06 09:41:35 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:41:35 compute-1 ceph-mon[79770]: Updating compute-2:/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.client.admin.keyring
Dec 06 09:41:35 compute-1 ceph-mon[79770]: Updating compute-0:/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.client.admin.keyring
Dec 06 09:41:35 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:41:35 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:41:35 compute-1 ceph-mon[79770]: Updating compute-1:/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.client.admin.keyring
Dec 06 09:41:35 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:41:35 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:41:35 compute-1 sudo[82658]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.client.admin.keyring.new
Dec 06 09:41:35 compute-1 sudo[82658]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:41:35 compute-1 sudo[82658]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:35 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e42 e42: 3 total, 3 up, 3 in
Dec 06 09:41:35 compute-1 sudo[82706]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.client.admin.keyring.new
Dec 06 09:41:35 compute-1 sudo[82706]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:41:35 compute-1 sudo[82706]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:35 compute-1 sudo[82731]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.client.admin.keyring.new
Dec 06 09:41:35 compute-1 sudo[82731]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:41:35 compute-1 sudo[82731]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:35 compute-1 sudo[82756]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.client.admin.keyring.new /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.client.admin.keyring
Dec 06 09:41:35 compute-1 sudo[82756]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:41:35 compute-1 sudo[82756]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:36 compute-1 ceph-mon[79770]: pgmap v9: 132 pgs: 132 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail; 35 KiB/s rd, 0 B/s wr, 14 op/s
Dec 06 09:41:36 compute-1 ceph-mon[79770]: osdmap e42: 3 total, 3 up, 3 in
Dec 06 09:41:36 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:41:36 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:41:36 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:41:36 compute-1 ceph-mon[79770]: Deploying daemon node-exporter.compute-0 on compute-0
Dec 06 09:41:36 compute-1 ceph-mon[79770]: mgrmap e29: compute-0.qhdjwa(active, since 6s), standbys: compute-1.sauzid, compute-2.oazbvn
Dec 06 09:41:37 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/351927990' entity='client.admin' cmd=[{"prefix": "auth import"}]: dispatch
Dec 06 09:41:37 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/351927990' entity='client.admin' cmd='[{"prefix": "auth import"}]': finished
Dec 06 09:41:37 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e42 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 09:41:38 compute-1 sudo[82781]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:41:38 compute-1 sudo[82781]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:41:38 compute-1 sudo[82781]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:38 compute-1 sudo[82806]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/prometheus/node-exporter:v1.7.0 --timeout 895 _orch deploy --fsid 5ecd3f74-dade-5fc4-92ce-8950ae424258
Dec 06 09:41:38 compute-1 sudo[82806]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:41:38 compute-1 ceph-mon[79770]: pgmap v11: 132 pgs: 132 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail; 29 KiB/s rd, 0 B/s wr, 12 op/s
Dec 06 09:41:38 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:41:38 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:41:38 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:41:38 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/824556430' entity='client.admin' cmd=[{"prefix": "status", "format": "json"}]: dispatch
Dec 06 09:41:38 compute-1 systemd[1]: Reloading.
Dec 06 09:41:39 compute-1 systemd-rc-local-generator[82899]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:41:39 compute-1 systemd-sysv-generator[82903]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:41:39 compute-1 systemd[1]: Reloading.
Dec 06 09:41:39 compute-1 systemd-rc-local-generator[82939]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:41:39 compute-1 systemd-sysv-generator[82943]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:41:39 compute-1 systemd[1]: Starting Ceph node-exporter.compute-1 for 5ecd3f74-dade-5fc4-92ce-8950ae424258...
Dec 06 09:41:39 compute-1 ceph-mon[79770]: Deploying daemon node-exporter.compute-1 on compute-1
Dec 06 09:41:39 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/917045225' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec 06 09:41:39 compute-1 bash[82997]: Trying to pull quay.io/prometheus/node-exporter:v1.7.0...
Dec 06 09:41:40 compute-1 bash[82997]: Getting image source signatures
Dec 06 09:41:40 compute-1 bash[82997]: Copying blob sha256:2abcce694348cd2c949c0e98a7400ebdfd8341021bcf6b541bc72033ce982510
Dec 06 09:41:40 compute-1 bash[82997]: Copying blob sha256:455fd88e5221bc1e278ef2d059cd70e4df99a24e5af050ede621534276f6cf9a
Dec 06 09:41:40 compute-1 bash[82997]: Copying blob sha256:324153f2810a9927fcce320af9e4e291e0b6e805cbdd1f338386c756b9defa24
Dec 06 09:41:40 compute-1 ceph-mon[79770]: pgmap v12: 132 pgs: 132 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail; 29 KiB/s rd, 0 B/s wr, 12 op/s
Dec 06 09:41:40 compute-1 bash[82997]: Copying config sha256:72c9c208898624938c9e4183d6686ea4a5fd3f912bc29bc3f00147924c521a3e
Dec 06 09:41:40 compute-1 bash[82997]: Writing manifest to image destination
Dec 06 09:41:40 compute-1 podman[82997]: 2025-12-06 09:41:40.840037377 +0000 UTC m=+1.078055011 container create 6af22af7046e22bedbb2fb280e4d2c530c5b3cac3959f396bf7fe3d14752a7eb (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 06 09:41:40 compute-1 podman[82997]: 2025-12-06 09:41:40.81834396 +0000 UTC m=+1.056361634 image pull 72c9c208898624938c9e4183d6686ea4a5fd3f912bc29bc3f00147924c521a3e quay.io/prometheus/node-exporter:v1.7.0
Dec 06 09:41:40 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/23e7edaacd69ae45d5bad71eeb3f011b2043921644e9cc36e86eee43df0ce8ca/merged/etc/node-exporter supports timestamps until 2038 (0x7fffffff)
Dec 06 09:41:40 compute-1 podman[82997]: 2025-12-06 09:41:40.905353682 +0000 UTC m=+1.143371376 container init 6af22af7046e22bedbb2fb280e4d2c530c5b3cac3959f396bf7fe3d14752a7eb (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 06 09:41:40 compute-1 podman[82997]: 2025-12-06 09:41:40.910650006 +0000 UTC m=+1.148667640 container start 6af22af7046e22bedbb2fb280e4d2c530c5b3cac3959f396bf7fe3d14752a7eb (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 06 09:41:40 compute-1 bash[82997]: 6af22af7046e22bedbb2fb280e4d2c530c5b3cac3959f396bf7fe3d14752a7eb
Dec 06 09:41:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.919Z caller=node_exporter.go:192 level=info msg="Starting node_exporter" version="(version=1.7.0, branch=HEAD, revision=7333465abf9efba81876303bb57e6fadb946041b)"
Dec 06 09:41:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.919Z caller=node_exporter.go:193 level=info msg="Build context" build_context="(go=go1.21.4, platform=linux/amd64, user=root@35918982f6d8, date=20231112-23:53:35, tags=netgo osusergo static_build)"
Dec 06 09:41:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.920Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Dec 06 09:41:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.920Z caller=diskstats_linux.go:265 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Dec 06 09:41:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.920Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Dec 06 09:41:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.920Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Dec 06 09:41:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.920Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Dec 06 09:41:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=arp
Dec 06 09:41:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=bcache
Dec 06 09:41:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=bonding
Dec 06 09:41:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=btrfs
Dec 06 09:41:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=conntrack
Dec 06 09:41:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=cpu
Dec 06 09:41:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=cpufreq
Dec 06 09:41:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=diskstats
Dec 06 09:41:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=dmi
Dec 06 09:41:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=edac
Dec 06 09:41:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=entropy
Dec 06 09:41:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=fibrechannel
Dec 06 09:41:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=filefd
Dec 06 09:41:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=filesystem
Dec 06 09:41:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=hwmon
Dec 06 09:41:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=infiniband
Dec 06 09:41:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=ipvs
Dec 06 09:41:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=loadavg
Dec 06 09:41:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=mdadm
Dec 06 09:41:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=meminfo
Dec 06 09:41:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=netclass
Dec 06 09:41:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=netdev
Dec 06 09:41:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=netstat
Dec 06 09:41:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=nfs
Dec 06 09:41:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=nfsd
Dec 06 09:41:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=nvme
Dec 06 09:41:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=os
Dec 06 09:41:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=powersupplyclass
Dec 06 09:41:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=pressure
Dec 06 09:41:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=rapl
Dec 06 09:41:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=schedstat
Dec 06 09:41:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=selinux
Dec 06 09:41:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=sockstat
Dec 06 09:41:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=softnet
Dec 06 09:41:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=stat
Dec 06 09:41:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=tapestats
Dec 06 09:41:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=textfile
Dec 06 09:41:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=thermal_zone
Dec 06 09:41:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=time
Dec 06 09:41:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=udp_queues
Dec 06 09:41:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=uname
Dec 06 09:41:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=vmstat
Dec 06 09:41:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.921Z caller=node_exporter.go:117 level=info collector=xfs
Dec 06 09:41:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.922Z caller=node_exporter.go:117 level=info collector=zfs
Dec 06 09:41:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.923Z caller=tls_config.go:274 level=info msg="Listening on" address=[::]:9100
Dec 06 09:41:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1[83073]: ts=2025-12-06T09:41:40.923Z caller=tls_config.go:277 level=info msg="TLS is disabled." http2=false address=[::]:9100
Dec 06 09:41:40 compute-1 systemd[1]: Started Ceph node-exporter.compute-1 for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec 06 09:41:40 compute-1 sudo[82806]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:41 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/1032166629' entity='client.admin' cmd=[{"prefix": "auth get", "entity": "client.openstack"}]: dispatch
Dec 06 09:41:41 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:41:41 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:41:41 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:41:42 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e42 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 09:41:43 compute-1 ceph-mon[79770]: pgmap v13: 132 pgs: 132 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail; 22 KiB/s rd, 0 B/s wr, 9 op/s
Dec 06 09:41:43 compute-1 ceph-mon[79770]: Deploying daemon node-exporter.compute-2 on compute-2
Dec 06 09:41:44 compute-1 ceph-mon[79770]: pgmap v14: 132 pgs: 132 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 0 B/s wr, 8 op/s
Dec 06 09:41:44 compute-1 ceph-mon[79770]: from='client.14466 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 06 09:41:44 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:41:44 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:41:44 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:41:44 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:41:44 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 06 09:41:44 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 06 09:41:44 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 09:41:44 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:41:46 compute-1 ceph-mon[79770]: pgmap v15: 132 pgs: 132 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 06 09:41:46 compute-1 ceph-mon[79770]: from='client.14472 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 06 09:41:47 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e42 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 09:41:48 compute-1 ceph-mon[79770]: pgmap v16: 132 pgs: 132 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 06 09:41:48 compute-1 ceph-mon[79770]: from='client.14478 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 06 09:41:49 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:41:49 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:41:49 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-2.qizhkr", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Dec 06 09:41:49 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-2.qizhkr", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Dec 06 09:41:49 compute-1 ceph-mon[79770]: pgmap v17: 132 pgs: 132 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 06 09:41:49 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:41:49 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 09:41:49 compute-1 ceph-mon[79770]: Deploying daemon rgw.rgw.compute-2.qizhkr on compute-2
Dec 06 09:41:49 compute-1 ceph-mon[79770]: from='client.14484 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 06 09:41:50 compute-1 sudo[83082]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:41:50 compute-1 sudo[83082]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:41:50 compute-1 sudo[83082]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:50 compute-1 sudo[83107]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 _orch deploy --fsid 5ecd3f74-dade-5fc4-92ce-8950ae424258
Dec 06 09:41:50 compute-1 sudo[83107]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:41:51 compute-1 podman[83172]: 2025-12-06 09:41:51.363445065 +0000 UTC m=+0.053251775 container create 40d29fec1885b4f15c1ecf2afe342b1c1845c682e51d5d7f40a18234b69c9472 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=cool_darwin, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 06 09:41:51 compute-1 systemd[1]: Started libpod-conmon-40d29fec1885b4f15c1ecf2afe342b1c1845c682e51d5d7f40a18234b69c9472.scope.
Dec 06 09:41:51 compute-1 podman[83172]: 2025-12-06 09:41:51.333032744 +0000 UTC m=+0.022839494 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 06 09:41:51 compute-1 systemd[1]: Started libcrun container.
Dec 06 09:41:51 compute-1 podman[83172]: 2025-12-06 09:41:51.462116468 +0000 UTC m=+0.151923238 container init 40d29fec1885b4f15c1ecf2afe342b1c1845c682e51d5d7f40a18234b69c9472 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=cool_darwin, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True)
Dec 06 09:41:51 compute-1 podman[83172]: 2025-12-06 09:41:51.469406279 +0000 UTC m=+0.159213059 container start 40d29fec1885b4f15c1ecf2afe342b1c1845c682e51d5d7f40a18234b69c9472 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=cool_darwin, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.build-date=20250325)
Dec 06 09:41:51 compute-1 podman[83172]: 2025-12-06 09:41:51.473772211 +0000 UTC m=+0.163578961 container attach 40d29fec1885b4f15c1ecf2afe342b1c1845c682e51d5d7f40a18234b69c9472 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=cool_darwin, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1)
Dec 06 09:41:51 compute-1 cool_darwin[83188]: 167 167
Dec 06 09:41:51 compute-1 systemd[1]: libpod-40d29fec1885b4f15c1ecf2afe342b1c1845c682e51d5d7f40a18234b69c9472.scope: Deactivated successfully.
Dec 06 09:41:51 compute-1 podman[83172]: 2025-12-06 09:41:51.479049804 +0000 UTC m=+0.168856534 container died 40d29fec1885b4f15c1ecf2afe342b1c1845c682e51d5d7f40a18234b69c9472 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=cool_darwin, OSD_FLAVOR=default, org.label-schema.build-date=20250325, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec 06 09:41:51 compute-1 systemd[1]: var-lib-containers-storage-overlay-86937cdc766307aff207e76b35a8eca25302da0280b0efceb85d37c406cb5ca1-merged.mount: Deactivated successfully.
Dec 06 09:41:51 compute-1 podman[83172]: 2025-12-06 09:41:51.539719901 +0000 UTC m=+0.229526631 container remove 40d29fec1885b4f15c1ecf2afe342b1c1845c682e51d5d7f40a18234b69c9472 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=cool_darwin, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 09:41:51 compute-1 systemd[1]: libpod-conmon-40d29fec1885b4f15c1ecf2afe342b1c1845c682e51d5d7f40a18234b69c9472.scope: Deactivated successfully.
Dec 06 09:41:51 compute-1 systemd[1]: Reloading.
Dec 06 09:41:51 compute-1 systemd-rc-local-generator[83232]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:41:51 compute-1 systemd-sysv-generator[83237]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:41:51 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e43 e43: 3 total, 3 up, 3 in
Dec 06 09:41:51 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 43 pg[9.0( empty local-lis/les=0/0 n=0 ec=43/43 lis/c=0/0 les/c/f=0/0/0 sis=43) [0] r=0 lpr=43 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:41:51 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:41:51 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:41:51 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:41:51 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-1.oqhsdh", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Dec 06 09:41:51 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-1.oqhsdh", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Dec 06 09:41:51 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:41:51 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 09:41:51 compute-1 ceph-mon[79770]: Deploying daemon rgw.rgw.compute-1.oqhsdh on compute-1
Dec 06 09:41:51 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/2506900584' entity='client.admin' cmd=[{"prefix": "status", "format": "json"}]: dispatch
Dec 06 09:41:51 compute-1 systemd[1]: Reloading.
Dec 06 09:41:51 compute-1 systemd-rc-local-generator[83276]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:41:51 compute-1 systemd-sysv-generator[83279]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:41:52 compute-1 systemd[1]: Starting Ceph rgw.rgw.compute-1.oqhsdh for 5ecd3f74-dade-5fc4-92ce-8950ae424258...
Dec 06 09:41:52 compute-1 podman[83334]: 2025-12-06 09:41:52.4117885 +0000 UTC m=+0.047707115 container create 99c2e5d092334c4a30122097374eccc65942e47288d38585ee146c9055718bdf (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-rgw-rgw-compute-1-oqhsdh, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 06 09:41:52 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/395746ebd40524aacce127092e2096cbba77a7b0eb9433a716457938c170aeba/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 06 09:41:52 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/395746ebd40524aacce127092e2096cbba77a7b0eb9433a716457938c170aeba/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 06 09:41:52 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/395746ebd40524aacce127092e2096cbba77a7b0eb9433a716457938c170aeba/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 06 09:41:52 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/395746ebd40524aacce127092e2096cbba77a7b0eb9433a716457938c170aeba/merged/var/lib/ceph/radosgw/ceph-rgw.rgw.compute-1.oqhsdh supports timestamps until 2038 (0x7fffffff)
Dec 06 09:41:52 compute-1 podman[83334]: 2025-12-06 09:41:52.387223006 +0000 UTC m=+0.023141641 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 06 09:41:52 compute-1 podman[83334]: 2025-12-06 09:41:52.489643389 +0000 UTC m=+0.125562054 container init 99c2e5d092334c4a30122097374eccc65942e47288d38585ee146c9055718bdf (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-rgw-rgw-compute-1-oqhsdh, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, ceph=True, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 06 09:41:52 compute-1 podman[83334]: 2025-12-06 09:41:52.494690957 +0000 UTC m=+0.130609592 container start 99c2e5d092334c4a30122097374eccc65942e47288d38585ee146c9055718bdf (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-rgw-rgw-compute-1-oqhsdh, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True)
Dec 06 09:41:52 compute-1 bash[83334]: 99c2e5d092334c4a30122097374eccc65942e47288d38585ee146c9055718bdf
Dec 06 09:41:52 compute-1 systemd[1]: Started Ceph rgw.rgw.compute-1.oqhsdh for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec 06 09:41:52 compute-1 sudo[83107]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:52 compute-1 radosgw[83354]: deferred set uid:gid to 167:167 (ceph:ceph)
Dec 06 09:41:52 compute-1 radosgw[83354]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process radosgw, pid 2
Dec 06 09:41:52 compute-1 radosgw[83354]: framework: beast
Dec 06 09:41:52 compute-1 radosgw[83354]: framework conf key: endpoint, val: 192.168.122.101:8082
Dec 06 09:41:52 compute-1 radosgw[83354]: init_numa not setting numa affinity
Dec 06 09:41:52 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e44 e44: 3 total, 3 up, 3 in
Dec 06 09:41:52 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 44 pg[9.0( empty local-lis/les=43/44 n=0 ec=43/43 lis/c=0/0 les/c/f=0/0/0 sis=43) [0] r=0 lpr=43 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:41:52 compute-1 ceph-mon[79770]: pgmap v18: 132 pgs: 132 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 06 09:41:52 compute-1 ceph-mon[79770]: osdmap e43: 3 total, 3 up, 3 in
Dec 06 09:41:52 compute-1 ceph-mon[79770]: from='client.? ' entity='client.rgw.rgw.compute-2.qizhkr' cmd=[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]: dispatch
Dec 06 09:41:52 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/3027759423' entity='client.rgw.rgw.compute-2.qizhkr' cmd=[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]: dispatch
Dec 06 09:41:52 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:41:52 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:41:52 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:41:52 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-0.zktslo", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Dec 06 09:41:52 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-0.zktslo", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Dec 06 09:41:52 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:41:52 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 09:41:52 compute-1 ceph-mon[79770]: from='client.? ' entity='client.rgw.rgw.compute-2.qizhkr' cmd='[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]': finished
Dec 06 09:41:52 compute-1 ceph-mon[79770]: osdmap e44: 3 total, 3 up, 3 in
Dec 06 09:41:52 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e44 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 09:41:53 compute-1 ceph-mon[79770]: Deploying daemon rgw.rgw.compute-0.zktslo on compute-0
Dec 06 09:41:53 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/1220877648' entity='client.admin' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
Dec 06 09:41:53 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e45 e45: 3 total, 3 up, 3 in
Dec 06 09:41:53 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"} v 0)
Dec 06 09:41:53 compute-1 ceph-mon[79770]: log_channel(audit) log [INF] : from='client.? 192.168.122.101:0/4120731466' entity='client.rgw.rgw.compute-1.oqhsdh' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Dec 06 09:41:54 compute-1 ceph-mon[79770]: pgmap v21: 133 pgs: 1 unknown, 132 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 06 09:41:54 compute-1 ceph-mon[79770]: osdmap e45: 3 total, 3 up, 3 in
Dec 06 09:41:54 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/4120731466' entity='client.rgw.rgw.compute-1.oqhsdh' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Dec 06 09:41:54 compute-1 ceph-mon[79770]: from='client.? ' entity='client.rgw.rgw.compute-1.oqhsdh' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Dec 06 09:41:54 compute-1 ceph-mon[79770]: from='client.? ' entity='client.rgw.rgw.compute-2.qizhkr' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Dec 06 09:41:54 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/827372016' entity='client.rgw.rgw.compute-2.qizhkr' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Dec 06 09:41:54 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/702722184' entity='client.admin' cmd=[{"prefix": "osd get-require-min-compat-client"}]: dispatch
Dec 06 09:41:54 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:41:54 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:41:54 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:41:54 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:41:54 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:41:54 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-2.czucwy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Dec 06 09:41:54 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-2.czucwy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Dec 06 09:41:54 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 09:41:54 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e46 e46: 3 total, 3 up, 3 in
Dec 06 09:41:55 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e47 e47: 3 total, 3 up, 3 in
Dec 06 09:41:55 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 47 pg[11.0( empty local-lis/les=0/0 n=0 ec=47/47 lis/c=0/0 les/c/f=0/0/0 sis=47) [0] r=0 lpr=47 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:41:55 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"} v 0)
Dec 06 09:41:55 compute-1 ceph-mon[79770]: log_channel(audit) log [INF] : from='client.? 192.168.122.101:0/4120731466' entity='client.rgw.rgw.compute-1.oqhsdh' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Dec 06 09:41:55 compute-1 ceph-mon[79770]: Saving service rgw.rgw spec with placement compute-0;compute-1;compute-2
Dec 06 09:41:55 compute-1 ceph-mon[79770]: Deploying daemon mds.cephfs.compute-2.czucwy on compute-2
Dec 06 09:41:55 compute-1 ceph-mon[79770]: from='client.? ' entity='client.rgw.rgw.compute-1.oqhsdh' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]': finished
Dec 06 09:41:55 compute-1 ceph-mon[79770]: from='client.? ' entity='client.rgw.rgw.compute-2.qizhkr' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]': finished
Dec 06 09:41:55 compute-1 ceph-mon[79770]: osdmap e46: 3 total, 3 up, 3 in
Dec 06 09:41:56 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e48 e48: 3 total, 3 up, 3 in
Dec 06 09:41:56 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 48 pg[11.0( empty local-lis/les=47/48 n=0 ec=47/47 lis/c=0/0 les/c/f=0/0/0 sis=47) [0] r=0 lpr=47 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:41:56 compute-1 ceph-mon[79770]: pgmap v24: 134 pgs: 134 active+clean; 450 KiB data, 80 MiB used, 60 GiB / 60 GiB avail; 7.0 KiB/s rd, 2.0 KiB/s wr, 11 op/s
Dec 06 09:41:56 compute-1 ceph-mon[79770]: osdmap e47: 3 total, 3 up, 3 in
Dec 06 09:41:56 compute-1 ceph-mon[79770]: from='client.? ' entity='client.rgw.rgw.compute-2.qizhkr' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Dec 06 09:41:56 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/1940551259' entity='client.rgw.rgw.compute-0.zktslo' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Dec 06 09:41:56 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/4120731466' entity='client.rgw.rgw.compute-1.oqhsdh' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Dec 06 09:41:56 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/827372016' entity='client.rgw.rgw.compute-2.qizhkr' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Dec 06 09:41:56 compute-1 ceph-mon[79770]: from='client.? ' entity='client.rgw.rgw.compute-1.oqhsdh' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Dec 06 09:41:56 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/607080093' entity='client.admin' cmd=[{"prefix": "versions", "format": "json"}]: dispatch
Dec 06 09:41:56 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:41:56 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:41:56 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:41:56 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-0.ujokui", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Dec 06 09:41:56 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-0.ujokui", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Dec 06 09:41:56 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 09:41:56 compute-1 ceph-mon[79770]: Deploying daemon mds.cephfs.compute-0.ujokui on compute-0
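The "auth get-or-create" pair above (dispatch, then finished) is how the mgr mints a keyring for each new MDS before deploying it. A hand-run equivalent, with the entity name and caps copied verbatim from the command in the log (cephadm normally issues this itself):

    # Recreate the cap profile the mgr requested for the new MDS.
    import subprocess

    subprocess.run(
        ["ceph", "auth", "get-or-create", "mds.cephfs.compute-0.ujokui",
         "mon", "profile mds",
         "osd", "allow rw tag cephfs *=*",
         "mds", "allow"],
        check=True,
    )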
Dec 06 09:41:56 compute-1 ceph-mon[79770]: from='client.? ' entity='client.rgw.rgw.compute-2.qizhkr' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]': finished
Dec 06 09:41:56 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/1940551259' entity='client.rgw.rgw.compute-0.zktslo' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]': finished
Dec 06 09:41:56 compute-1 ceph-mon[79770]: from='client.? ' entity='client.rgw.rgw.compute-1.oqhsdh' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]': finished
Dec 06 09:41:56 compute-1 ceph-mon[79770]: osdmap e48: 3 total, 3 up, 3 in
Dec 06 09:41:56 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).mds e3 new map
Dec 06 09:41:56 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).mds e3 print_map
                                           e3
                                           btime 2025-12-06T09:41:56.804272+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        2
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2025-12-06T09:41:29.967778+0000
                                           modified        2025-12-06T09:41:29.967778+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           max_mds        1
                                           in        
                                           up        {}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        0
                                           qdb_cluster        leader: 0 members: 
                                            
                                            
                                           Standby daemons:
                                            
                                           [mds.cephfs.compute-2.czucwy{-1:24274} state up:standby seq 1 addr [v2:192.168.122.102:6804/1500676117,v1:192.168.122.102:6805/1500676117] compat {c=[1],r=[1],i=[1fff]}]
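In the e3 map above the filesystem still has no ranks in ("up {}") and one standby daemon; the e4 dump that follows shows rank 0 handed to that standby. One way to wait for that transition from a script, assuming the ceph CLI and the JSON field names emitted by recent releases:

    # Poll the fsmap until rank 0 of 'cephfs' is assigned.
    import json
    import subprocess
    import time

    while True:
        dump = json.loads(subprocess.run(
            ["ceph", "fs", "dump", "--format", "json"],
            capture_output=True, text=True, check=True,
        ).stdout)
        up = dump["filesystems"][0]["mdsmap"]["up"]
        if up:                      # e.g. {"mds_0": 24274} once assigned
            print("rank 0 up:", up)
            break
        time.sleep(1)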
Dec 06 09:41:56 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).mds e4 new map
Dec 06 09:41:56 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).mds e4 print_map
                                           e4
                                           btime 2025-12-06T09:41:56.835698+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        4
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2025-12-06T09:41:29.967778+0000
                                           modified        2025-12-06T09:41:56.835690+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           max_mds        1
                                           in        0
                                           up        {0=24274}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        0
                                           qdb_cluster        leader: 0 members: 
                                           [mds.cephfs.compute-2.czucwy{0:24274} state up:creating seq 1 addr [v2:192.168.122.102:6804/1500676117,v1:192.168.122.102:6805/1500676117] compat {c=[1],r=[1],i=[1fff]}]
                                            
                                            
Dec 06 09:41:57 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e49 e49: 3 total, 3 up, 3 in
Dec 06 09:41:57 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"} v 0)
Dec 06 09:41:57 compute-1 ceph-mon[79770]: log_channel(audit) log [INF] : from='client.? 192.168.122.101:0/4120731466' entity='client.rgw.rgw.compute-1.oqhsdh' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Dec 06 09:41:57 compute-1 ceph-mon[79770]: mds.? [v2:192.168.122.102:6804/1500676117,v1:192.168.122.102:6805/1500676117] up:boot
Dec 06 09:41:57 compute-1 ceph-mon[79770]: daemon mds.cephfs.compute-2.czucwy assigned to filesystem cephfs as rank 0 (now has 1 ranks)
Dec 06 09:41:57 compute-1 ceph-mon[79770]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline)
Dec 06 09:41:57 compute-1 ceph-mon[79770]: Health check cleared: MDS_UP_LESS_THAN_MAX (was: 1 filesystem is online with fewer MDS than max_mds)
Dec 06 09:41:57 compute-1 ceph-mon[79770]: Cluster is now healthy
Dec 06 09:41:57 compute-1 ceph-mon[79770]: fsmap cephfs:0 1 up:standby
Dec 06 09:41:57 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "mds metadata", "who": "cephfs.compute-2.czucwy"}]: dispatch
Dec 06 09:41:57 compute-1 ceph-mon[79770]: fsmap cephfs:1 {0=cephfs.compute-2.czucwy=up:creating}
Dec 06 09:41:57 compute-1 ceph-mon[79770]: daemon mds.cephfs.compute-2.czucwy is now active in filesystem cephfs as rank 0
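With both MDS health checks cleared and rank 0 active, overall health can be asserted from a script; "ceph health" in JSON form returns a top-level "status" field:

    # Confirm the 'Cluster is now healthy' state reported above.
    import json
    import subprocess

    health = json.loads(subprocess.run(
        ["ceph", "health", "--format", "json"],
        capture_output=True, text=True, check=True,
    ).stdout)
    assert health["status"] == "HEALTH_OK", health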
Dec 06 09:41:57 compute-1 ceph-mon[79770]: pgmap v27: 135 pgs: 1 unknown, 134 active+clean; 450 KiB data, 80 MiB used, 60 GiB / 60 GiB avail; 7.0 KiB/s rd, 2.0 KiB/s wr, 11 op/s
Dec 06 09:41:57 compute-1 ceph-mon[79770]: osdmap e49: 3 total, 3 up, 3 in
Dec 06 09:41:57 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e49 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 09:41:57 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).mds e5 new map
Dec 06 09:41:57 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).mds e5 print_map
                                           e5
                                           btime 2025-12-06T09:41:57.856282+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        5
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2025-12-06T09:41:29.967778+0000
                                           modified        2025-12-06T09:41:57.856277+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           max_mds        1
                                           in        0
                                           up        {0=24274}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        0
                                           qdb_cluster        leader: 24274 members: 24274
                                           [mds.cephfs.compute-2.czucwy{0:24274} state up:active seq 2 addr [v2:192.168.122.102:6804/1500676117,v1:192.168.122.102:6805/1500676117] compat {c=[1],r=[1],i=[1fff]}]
                                            
                                            
Dec 06 09:41:58 compute-1 sudo[83942]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:41:58 compute-1 sudo[83942]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:41:58 compute-1 sudo[83942]: pam_unix(sudo:session): session closed for user root
Dec 06 09:41:58 compute-1 sudo[83967]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 _orch deploy --fsid 5ecd3f74-dade-5fc4-92ce-8950ae424258
Dec 06 09:41:58 compute-1 sudo[83967]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:41:58 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e50 e50: 3 total, 3 up, 3 in
Dec 06 09:41:58 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"} v 0)
Dec 06 09:41:58 compute-1 ceph-mon[79770]: log_channel(audit) log [INF] : from='client.? 192.168.122.101:0/4120731466' entity='client.rgw.rgw.compute-1.oqhsdh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Dec 06 09:41:58 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/4120731466' entity='client.rgw.rgw.compute-1.oqhsdh' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Dec 06 09:41:58 compute-1 ceph-mon[79770]: from='client.? ' entity='client.rgw.rgw.compute-2.qizhkr' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Dec 06 09:41:58 compute-1 ceph-mon[79770]: from='client.? ' entity='client.rgw.rgw.compute-1.oqhsdh' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Dec 06 09:41:58 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/1940551259' entity='client.rgw.rgw.compute-0.zktslo' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Dec 06 09:41:58 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/827372016' entity='client.rgw.rgw.compute-2.qizhkr' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
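The RGW startup sequence above raises pg_autoscale_bias to 4 on default.rgw.meta: the metadata pool holds little data but serves hot, small objects, so the autoscaler is told to weight it as if it held several times as much. The CLI equivalent of the dispatched command:

    # Bias the PG autoscaler for the RGW metadata pool, as requested
    # by the mon command above.
    import subprocess

    subprocess.run(
        ["ceph", "osd", "pool", "set", "default.rgw.meta",
         "pg_autoscale_bias", "4"],
        check=True,
    )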
Dec 06 09:41:58 compute-1 ceph-mon[79770]: mds.? [v2:192.168.122.102:6804/1500676117,v1:192.168.122.102:6805/1500676117] up:active
Dec 06 09:41:58 compute-1 ceph-mon[79770]: fsmap cephfs:1 {0=cephfs.compute-2.czucwy=up:active}
Dec 06 09:41:58 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:41:58 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:41:58 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:41:58 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-1.fpvjgb", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Dec 06 09:41:58 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-1.fpvjgb", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Dec 06 09:41:58 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 09:41:58 compute-1 ceph-mon[79770]: Deploying daemon mds.cephfs.compute-1.fpvjgb on compute-1
Dec 06 09:41:58 compute-1 ceph-mon[79770]: from='client.? ' entity='client.rgw.rgw.compute-2.qizhkr' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Dec 06 09:41:58 compute-1 ceph-mon[79770]: from='client.? ' entity='client.rgw.rgw.compute-1.oqhsdh' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Dec 06 09:41:58 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/1940551259' entity='client.rgw.rgw.compute-0.zktslo' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Dec 06 09:41:58 compute-1 ceph-mon[79770]: osdmap e50: 3 total, 3 up, 3 in
Dec 06 09:41:58 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).mds e6 new map
Dec 06 09:41:58 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).mds e6 print_map
                                           e6
                                           btime 2025-12-06T09:41:58.872230+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        5
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2025-12-06T09:41:29.967778+0000
                                           modified        2025-12-06T09:41:57.856277+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           max_mds        1
                                           in        0
                                           up        {0=24274}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        0
                                           qdb_cluster        leader: 24274 members: 24274
                                           [mds.cephfs.compute-2.czucwy{0:24274} state up:active seq 2 addr [v2:192.168.122.102:6804/1500676117,v1:192.168.122.102:6805/1500676117] compat {c=[1],r=[1],i=[1fff]}]
                                            
                                            
                                           Standby daemons:
                                            
                                           [mds.cephfs.compute-0.ujokui{-1:14544} state up:standby seq 1 addr [v2:192.168.122.100:6806/2465826838,v1:192.168.122.100:6807/2465826838] compat {c=[1],r=[1],i=[1fff]}]
Dec 06 09:41:58 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).mds e7 new map
Dec 06 09:41:58 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).mds e7 print_map
                                           e7
                                           btime 2025-12-06T09:41:58.889029+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        5
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2025-12-06T09:41:29.967778+0000
                                           modified        2025-12-06T09:41:57.856277+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           max_mds        1
                                           in        0
                                           up        {0=24274}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        1
                                           qdb_cluster        leader: 24274 members: 24274
                                           [mds.cephfs.compute-2.czucwy{0:24274} state up:active seq 2 addr [v2:192.168.122.102:6804/1500676117,v1:192.168.122.102:6805/1500676117] compat {c=[1],r=[1],i=[1fff]}]
                                            
                                            
                                           Standby daemons:
                                            
                                           [mds.cephfs.compute-0.ujokui{-1:14544} state up:standby seq 1 addr [v2:192.168.122.100:6806/2465826838,v1:192.168.122.100:6807/2465826838] compat {c=[1],r=[1],i=[1fff]}]
Dec 06 09:41:59 compute-1 podman[84032]: 2025-12-06 09:41:59.136640873 +0000 UTC m=+0.059561972 container create b38516c0d5f3f2df479ab73c1fd28dc8b20c7fa3feacf446ae058e7b664cd616 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=magical_nash, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True)
Dec 06 09:41:59 compute-1 systemd[1]: Started libpod-conmon-b38516c0d5f3f2df479ab73c1fd28dc8b20c7fa3feacf446ae058e7b664cd616.scope.
Dec 06 09:41:59 compute-1 podman[84032]: 2025-12-06 09:41:59.106892878 +0000 UTC m=+0.029814017 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 06 09:41:59 compute-1 systemd[1]: Started libcrun container.
Dec 06 09:41:59 compute-1 podman[84032]: 2025-12-06 09:41:59.239815633 +0000 UTC m=+0.162736762 container init b38516c0d5f3f2df479ab73c1fd28dc8b20c7fa3feacf446ae058e7b664cd616 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=magical_nash, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 06 09:41:59 compute-1 podman[84032]: 2025-12-06 09:41:59.253012351 +0000 UTC m=+0.175933450 container start b38516c0d5f3f2df479ab73c1fd28dc8b20c7fa3feacf446ae058e7b664cd616 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=magical_nash, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True)
Dec 06 09:41:59 compute-1 podman[84032]: 2025-12-06 09:41:59.256922843 +0000 UTC m=+0.179843942 container attach b38516c0d5f3f2df479ab73c1fd28dc8b20c7fa3feacf446ae058e7b664cd616 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=magical_nash, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 06 09:41:59 compute-1 magical_nash[84048]: 167 167
Dec 06 09:41:59 compute-1 systemd[1]: libpod-b38516c0d5f3f2df479ab73c1fd28dc8b20c7fa3feacf446ae058e7b664cd616.scope: Deactivated successfully.
Dec 06 09:41:59 compute-1 podman[84032]: 2025-12-06 09:41:59.260822954 +0000 UTC m=+0.183744053 container died b38516c0d5f3f2df479ab73c1fd28dc8b20c7fa3feacf446ae058e7b664cd616 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=magical_nash, ceph=True, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325)
Dec 06 09:41:59 compute-1 systemd[1]: var-lib-containers-storage-overlay-13bd6873fe53a8616bb0eae094bc00bb67d033bb8f1e78b02206f256dbcfd3e5-merged.mount: Deactivated successfully.
Dec 06 09:41:59 compute-1 podman[84032]: 2025-12-06 09:41:59.316476164 +0000 UTC m=+0.239397283 container remove b38516c0d5f3f2df479ab73c1fd28dc8b20c7fa3feacf446ae058e7b664cd616 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=magical_nash, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 09:41:59 compute-1 systemd[1]: libpod-conmon-b38516c0d5f3f2df479ab73c1fd28dc8b20c7fa3feacf446ae058e7b664cd616.scope: Deactivated successfully.
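The throwaway container above (magical_nash) lives for well under a second and prints only "167 167", matching the "set uid:gid to 167:167 (ceph:ceph)" the daemons report later; this looks like cephadm probing the image for the ceph user's uid and gid. A sketch of such a probe, using the image digest pulled in this log (the stat path inside the image is an assumption):

    # Probe the ceph image for the uid/gid of its ceph-owned data
    # directory; cephadm appears to use the result (167 167) when
    # chowning daemon directories on the host.
    import subprocess

    IMAGE = ("quay.io/ceph/ceph@sha256:"
             "7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec")
    out = subprocess.run(
        ["podman", "run", "--rm", "--entrypoint", "stat", IMAGE,
         "-c", "%u %g", "/var/lib/ceph"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    print(out)                      # expected: 167 167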
Dec 06 09:41:59 compute-1 systemd[1]: Reloading.
Dec 06 09:41:59 compute-1 systemd-rc-local-generator[84090]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:41:59 compute-1 systemd-sysv-generator[84095]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:41:59 compute-1 systemd[1]: Reloading.
Dec 06 09:41:59 compute-1 systemd-rc-local-generator[84130]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:41:59 compute-1 systemd-sysv-generator[84134]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
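The systemd-rc-local-generator and systemd-sysv-generator messages repeat on every daemon-reload (cephadm reloads systemd each time it installs a unit) and are harmless here. If /etc/rc.d/rc.local were actually wanted, the generator skips it only for the missing executable bit:

    # Make systemd-rc-local-generator pick up rc.local again
    # (it is skipped above purely because the file is not executable).
    import os
    import stat

    p = "/etc/rc.d/rc.local"
    os.chmod(p, os.stat(p).st_mode | stat.S_IXUSR | stat.S_IXGRP | stat.S_IXOTH)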
Dec 06 09:41:59 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e51 e51: 3 total, 3 up, 3 in
Dec 06 09:41:59 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/1940551259' entity='client.rgw.rgw.compute-0.zktslo' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Dec 06 09:41:59 compute-1 ceph-mon[79770]: from='client.? ' entity='client.rgw.rgw.compute-2.qizhkr' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Dec 06 09:41:59 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/4120731466' entity='client.rgw.rgw.compute-1.oqhsdh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Dec 06 09:41:59 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/827372016' entity='client.rgw.rgw.compute-2.qizhkr' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Dec 06 09:41:59 compute-1 ceph-mon[79770]: from='client.? ' entity='client.rgw.rgw.compute-1.oqhsdh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Dec 06 09:41:59 compute-1 ceph-mon[79770]: mds.? [v2:192.168.122.100:6806/2465826838,v1:192.168.122.100:6807/2465826838] up:boot
Dec 06 09:41:59 compute-1 ceph-mon[79770]: fsmap cephfs:1 {0=cephfs.compute-2.czucwy=up:active} 1 up:standby
Dec 06 09:41:59 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "mds metadata", "who": "cephfs.compute-0.ujokui"}]: dispatch
Dec 06 09:41:59 compute-1 ceph-mon[79770]: fsmap cephfs:1 {0=cephfs.compute-2.czucwy=up:active} 1 up:standby
Dec 06 09:41:59 compute-1 ceph-mon[79770]: pgmap v30: 136 pgs: 2 unknown, 134 active+clean; 450 KiB data, 80 MiB used, 60 GiB / 60 GiB avail
Dec 06 09:41:59 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:41:59 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/1940551259' entity='client.rgw.rgw.compute-0.zktslo' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Dec 06 09:41:59 compute-1 ceph-mon[79770]: from='client.? ' entity='client.rgw.rgw.compute-2.qizhkr' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Dec 06 09:41:59 compute-1 ceph-mon[79770]: from='client.? ' entity='client.rgw.rgw.compute-1.oqhsdh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Dec 06 09:41:59 compute-1 ceph-mon[79770]: osdmap e51: 3 total, 3 up, 3 in
Dec 06 09:41:59 compute-1 systemd[1]: Starting Ceph mds.cephfs.compute-1.fpvjgb for 5ecd3f74-dade-5fc4-92ce-8950ae424258...
Dec 06 09:42:00 compute-1 radosgw[83354]: v1 topic migration: starting v1 topic migration..
Dec 06 09:42:00 compute-1 radosgw[83354]: LDAP not started since no server URIs were provided in the configuration.
Dec 06 09:42:00 compute-1 radosgw[83354]: v1 topic migration: finished v1 topic migration
Dec 06 09:42:00 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-rgw-rgw-compute-1-oqhsdh[83350]: 2025-12-06T09:42:00.169+0000 7f396e20b980 -1 LDAP not started since no server URIs were provided in the configuration.
Dec 06 09:42:00 compute-1 radosgw[83354]: framework: beast
Dec 06 09:42:00 compute-1 radosgw[83354]: framework conf key: ssl_certificate, val: config://rgw/cert/$realm/$zone.crt
Dec 06 09:42:00 compute-1 radosgw[83354]: framework conf key: ssl_private_key, val: config://rgw/cert/$realm/$zone.key
Dec 06 09:42:00 compute-1 radosgw[83354]: starting handler: beast
Dec 06 09:42:00 compute-1 radosgw[83354]: set uid:gid to 167:167 (ceph:ceph)
Dec 06 09:42:00 compute-1 radosgw[83354]: mgrc service_daemon_register rgw.24197 metadata {arch=x86_64,ceph_release=squid,ceph_version=ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable),ceph_version_short=19.2.3,container_hostname=compute-1,container_image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec,cpu=AMD EPYC-Rome Processor,distro=centos,distro_description=CentOS Stream 9,distro_version=9,frontend_config#0=beast endpoint=192.168.122.101:8082,frontend_type#0=beast,hostname=compute-1,id=rgw.compute-1.oqhsdh,kernel_description=#1 SMP PREEMPT_DYNAMIC Fri Nov 28 14:01:17 UTC 2025,kernel_version=5.14.0-645.el9.x86_64,mem_swap_kb=1048572,mem_total_kb=7864312,num_handles=1,os=Linux,pid=2,realm_id=,realm_name=,zone_id=d81f60a3-cfd4-40b3-a809-ad3aae1b1fd0,zone_name=default,zonegroup_id=75773215-ab74-4afd-a4c0-f777a01e4a1a,zonegroup_name=default}
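The registration metadata above shows the beast frontend bound at 192.168.122.101:8082. An anonymous GET against a plain-HTTP beast endpoint returns an S3 ListAllMyBucketsResult document, which makes a cheap liveness probe; whether this particular endpoint serves HTTP rather than HTTPS (the ssl_certificate config:// keys above may be unset) is an assumption:

    # Liveness probe for the RGW beast endpoint registered above.
    import urllib.request

    with urllib.request.urlopen("http://192.168.122.101:8082/",
                                timeout=5) as r:
        print(r.status, r.read(60))   # 200 b'<?xml ...ListAllMyBuckets...'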
Dec 06 09:42:00 compute-1 podman[84221]: 2025-12-06 09:42:00.558206777 +0000 UTC m=+0.044856318 container create 17b03b2bf6f0162831451ffcdd012066b2bc55aad88c494025779ae0bd1c353e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mds-cephfs-compute-1-fpvjgb, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 06 09:42:00 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aea48407195da003f757e5ee98f66cd59111179655f3523b76cbd279f1cde646/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 06 09:42:00 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aea48407195da003f757e5ee98f66cd59111179655f3523b76cbd279f1cde646/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 06 09:42:00 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aea48407195da003f757e5ee98f66cd59111179655f3523b76cbd279f1cde646/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 06 09:42:00 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aea48407195da003f757e5ee98f66cd59111179655f3523b76cbd279f1cde646/merged/var/lib/ceph/mds/ceph-cephfs.compute-1.fpvjgb supports timestamps until 2038 (0x7fffffff)
Dec 06 09:42:00 compute-1 podman[84221]: 2025-12-06 09:42:00.622881928 +0000 UTC m=+0.109531489 container init 17b03b2bf6f0162831451ffcdd012066b2bc55aad88c494025779ae0bd1c353e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mds-cephfs-compute-1-fpvjgb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_REF=squid)
Dec 06 09:42:00 compute-1 podman[84221]: 2025-12-06 09:42:00.629245737 +0000 UTC m=+0.115895278 container start 17b03b2bf6f0162831451ffcdd012066b2bc55aad88c494025779ae0bd1c353e (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mds-cephfs-compute-1-fpvjgb, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid)
Dec 06 09:42:00 compute-1 bash[84221]: 17b03b2bf6f0162831451ffcdd012066b2bc55aad88c494025779ae0bd1c353e
Dec 06 09:42:00 compute-1 podman[84221]: 2025-12-06 09:42:00.538114138 +0000 UTC m=+0.024763709 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 06 09:42:00 compute-1 systemd[1]: Started Ceph mds.cephfs.compute-1.fpvjgb for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
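cephadm runs each daemon as a templated systemd unit named ceph-<fsid>@<daemon-type>.<daemon-id>, which is the unit systemd reports started above. Checking it by hand:

    # Verify the MDS unit that systemd just started.
    import subprocess

    unit = ("ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258"
            "@mds.cephfs.compute-1.fpvjgb.service")
    subprocess.run(["systemctl", "is-active", unit], check=True)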
Dec 06 09:42:00 compute-1 ceph-mds[84241]: set uid:gid to 167:167 (ceph:ceph)
Dec 06 09:42:00 compute-1 ceph-mds[84241]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mds, pid 2
Dec 06 09:42:00 compute-1 ceph-mds[84241]: main not setting numa affinity
Dec 06 09:42:00 compute-1 ceph-mds[84241]: pidfile_write: ignore empty --pid-file
Dec 06 09:42:00 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mds-cephfs-compute-1-fpvjgb[84237]: starting mds.cephfs.compute-1.fpvjgb at 
Dec 06 09:42:00 compute-1 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb Updating MDS map to version 7 from mon.2
Dec 06 09:42:00 compute-1 sudo[83967]: pam_unix(sudo:session): session closed for user root
Dec 06 09:42:00 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).mds e8 new map
Dec 06 09:42:00 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).mds e8 print_map
                                           e8
                                           btime 2025-12-06T09:42:00.908587+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        8
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2025-12-06T09:41:29.967778+0000
                                           modified        2025-12-06T09:42:00.880325+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           max_mds        1
                                           in        0
                                           up        {0=24274}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        1
                                           qdb_cluster        leader: 24274 members: 24274
                                           [mds.cephfs.compute-2.czucwy{0:24274} state up:active seq 3 join_fscid=1 addr [v2:192.168.122.102:6804/1500676117,v1:192.168.122.102:6805/1500676117] compat {c=[1],r=[1],i=[1fff]}]
                                            
                                            
                                           Standby daemons:
                                            
                                           [mds.cephfs.compute-0.ujokui{-1:14544} state up:standby seq 1 addr [v2:192.168.122.100:6806/2465826838,v1:192.168.122.100:6807/2465826838] compat {c=[1],r=[1],i=[1fff]}]
                                           [mds.cephfs.compute-1.fpvjgb{-1:24215} state up:standby seq 1 addr [v2:192.168.122.101:6804/2619956440,v1:192.168.122.101:6805/2619956440] compat {c=[1],r=[1],i=[1fff]}]
Dec 06 09:42:00 compute-1 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb Updating MDS map to version 8 from mon.2
Dec 06 09:42:00 compute-1 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb Monitors have assigned me to become a standby
Dec 06 09:42:01 compute-1 sudo[84261]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:42:01 compute-1 sudo[84261]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:42:01 compute-1 sudo[84261]: pam_unix(sudo:session): session closed for user root
Dec 06 09:42:01 compute-1 sudo[84286]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 _orch deploy --fsid 5ecd3f74-dade-5fc4-92ce-8950ae424258
Dec 06 09:42:01 compute-1 sudo[84286]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:42:01 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:42:01 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:42:01 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:42:01 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:42:01 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:42:01 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:42:01 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.0.0.compute-1.djsnbu", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]: dispatch
Dec 06 09:42:01 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.0.0.compute-1.djsnbu", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]': finished
Dec 06 09:42:01 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]: dispatch
Dec 06 09:42:01 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]': finished
Dec 06 09:42:01 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 09:42:01 compute-1 ceph-mon[79770]: mds.? [v2:192.168.122.101:6804/2619956440,v1:192.168.122.101:6805/2619956440] up:boot
Dec 06 09:42:01 compute-1 ceph-mon[79770]: mds.? [v2:192.168.122.102:6804/1500676117,v1:192.168.122.102:6805/1500676117] up:active
Dec 06 09:42:01 compute-1 ceph-mon[79770]: fsmap cephfs:1 {0=cephfs.compute-2.czucwy=up:active} 2 up:standby
Dec 06 09:42:01 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "mds metadata", "who": "cephfs.compute-1.fpvjgb"}]: dispatch
Dec 06 09:42:01 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]: dispatch
Dec 06 09:42:01 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]': finished
Dec 06 09:42:01 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.0.0.compute-1.djsnbu-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Dec 06 09:42:01 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.0.0.compute-1.djsnbu-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]': finished
Dec 06 09:42:01 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 09:42:01 compute-1 podman[84349]: 2025-12-06 09:42:01.769710195 +0000 UTC m=+0.055684152 container create 16e407b4e4845d7f6c5384028459f391a49c74c6f25c684ff10a6cc61cfd24b2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zen_goldwasser, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec 06 09:42:01 compute-1 systemd[1]: Started libpod-conmon-16e407b4e4845d7f6c5384028459f391a49c74c6f25c684ff10a6cc61cfd24b2.scope.
Dec 06 09:42:01 compute-1 podman[84349]: 2025-12-06 09:42:01.744770283 +0000 UTC m=+0.030744270 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 06 09:42:01 compute-1 systemd[1]: Started libcrun container.
Dec 06 09:42:01 compute-1 podman[84349]: 2025-12-06 09:42:01.87782082 +0000 UTC m=+0.163794797 container init 16e407b4e4845d7f6c5384028459f391a49c74c6f25c684ff10a6cc61cfd24b2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zen_goldwasser, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, ceph=True, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 06 09:42:01 compute-1 podman[84349]: 2025-12-06 09:42:01.885806087 +0000 UTC m=+0.171780044 container start 16e407b4e4845d7f6c5384028459f391a49c74c6f25c684ff10a6cc61cfd24b2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zen_goldwasser, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.build-date=20250325, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0)
Dec 06 09:42:01 compute-1 podman[84349]: 2025-12-06 09:42:01.889062813 +0000 UTC m=+0.175036770 container attach 16e407b4e4845d7f6c5384028459f391a49c74c6f25c684ff10a6cc61cfd24b2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zen_goldwasser, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 06 09:42:01 compute-1 systemd[1]: libpod-16e407b4e4845d7f6c5384028459f391a49c74c6f25c684ff10a6cc61cfd24b2.scope: Deactivated successfully.
Dec 06 09:42:01 compute-1 zen_goldwasser[84366]: 167 167
Dec 06 09:42:01 compute-1 conmon[84366]: conmon 16e407b4e4845d7f6c53 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-16e407b4e4845d7f6c5384028459f391a49c74c6f25c684ff10a6cc61cfd24b2.scope/container/memory.events
Dec 06 09:42:01 compute-1 podman[84349]: 2025-12-06 09:42:01.894830438 +0000 UTC m=+0.180804395 container died 16e407b4e4845d7f6c5384028459f391a49c74c6f25c684ff10a6cc61cfd24b2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zen_goldwasser, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0)
Dec 06 09:42:01 compute-1 systemd[1]: var-lib-containers-storage-overlay-9f8c89d08f4caf7f1c2d0bba3afec1deda6f360e555a94df8ec3a28965ea630b-merged.mount: Deactivated successfully.
Dec 06 09:42:01 compute-1 podman[84349]: 2025-12-06 09:42:01.94032549 +0000 UTC m=+0.226299457 container remove 16e407b4e4845d7f6c5384028459f391a49c74c6f25c684ff10a6cc61cfd24b2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zen_goldwasser, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 06 09:42:01 compute-1 systemd[1]: libpod-conmon-16e407b4e4845d7f6c5384028459f391a49c74c6f25c684ff10a6cc61cfd24b2.scope: Deactivated successfully.
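
The zen_goldwasser container above is one of cephadm's short-lived helper containers: podman journals the full create/init/start/attach/died/remove lifecycle inside roughly 250 ms. To watch the same sequence live, a minimal sketch using podman's event stream (the JSON key names are an assumption about this podman version's event format):

    import json
    import subprocess

    # Follow podman lifecycle events like the create/init/start/died/remove
    # sequence journaled above. Assumes `podman` is installed and on PATH.
    proc = subprocess.Popen(
        ["podman", "events", "--format", "json"],
        stdout=subprocess.PIPE, text=True,
    )
    for line in proc.stdout:
        ev = json.loads(line)
        # Key names ("Time", "Type", "Status", "Name") are assumed from the
        # JSON event stream; adjust for your podman version if they differ.
        print(ev.get("Time"), ev.get("Type"), ev.get("Status"), ev.get("Name"))

Helpers like this one exit almost immediately, which is why systemd reports the libpod scope as deactivated within the same second.
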
Dec 06 09:42:01 compute-1 systemd[1]: Reloading.
Dec 06 09:42:02 compute-1 systemd-sysv-generator[84414]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:42:02 compute-1 systemd-rc-local-generator[84405]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:42:02 compute-1 systemd[1]: Reloading.
Dec 06 09:42:02 compute-1 systemd-rc-local-generator[84453]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:42:02 compute-1 systemd-sysv-generator[84457]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:42:02 compute-1 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.djsnbu for 5ecd3f74-dade-5fc4-92ce-8950ae424258...
Dec 06 09:42:02 compute-1 ceph-mon[79770]: Creating key for client.nfs.cephfs.0.0.compute-1.djsnbu
Dec 06 09:42:02 compute-1 ceph-mon[79770]: Ensuring nfs.cephfs.0 is in the ganesha grace table
Dec 06 09:42:02 compute-1 ceph-mon[79770]: pgmap v32: 136 pgs: 136 active+clean; 454 KiB data, 81 MiB used, 60 GiB / 60 GiB avail; 2.6 KiB/s rd, 7.4 KiB/s wr, 29 op/s
Dec 06 09:42:02 compute-1 ceph-mon[79770]: Rados config object exists: conf-nfs.cephfs
Dec 06 09:42:02 compute-1 ceph-mon[79770]: Creating key for client.nfs.cephfs.0.0.compute-1.djsnbu-rgw
Dec 06 09:42:02 compute-1 ceph-mon[79770]: Bind address in nfs.cephfs.0.0.compute-1.djsnbu's ganesha conf is defaulting to empty
Dec 06 09:42:02 compute-1 ceph-mon[79770]: Deploying daemon nfs.cephfs.0.0.compute-1.djsnbu on compute-1
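
"Rados config object exists: conf-nfs.cephfs" and the empty bind address above refer to the shared ganesha configuration that the mgr nfs module keeps as a RADOS object in the .nfs pool, namespaced by the NFS cluster id. A minimal sketch of reading it back (pool, namespace, and object name are taken from this log; an admin keyring is assumed):

    import subprocess

    # Dump the shared ganesha config object the deploy step just checked.
    # "-" as the output file makes `rados get` write to stdout.
    out = subprocess.run(
        ["rados", "-p", ".nfs", "-N", "cephfs", "get", "conf-nfs.cephfs", "-"],
        capture_output=True, text=True, check=True,
    )
    print(out.stdout)
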
Dec 06 09:42:02 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e51 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 09:42:02 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).mds e9 new map
Dec 06 09:42:02 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).mds e9 print_map
                                           e9
                                           btime 2025-12-06T09:42:02.933823+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        8
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2025-12-06T09:41:29.967778+0000
                                           modified        2025-12-06T09:42:00.880325+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           max_mds        1
                                           in        0
                                           up        {0=24274}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        1
                                           qdb_cluster        leader: 24274 members: 24274
                                           [mds.cephfs.compute-2.czucwy{0:24274} state up:active seq 3 join_fscid=1 addr [v2:192.168.122.102:6804/1500676117,v1:192.168.122.102:6805/1500676117] compat {c=[1],r=[1],i=[1fff]}]
                                            
                                            
                                           Standby daemons:
                                            
                                           [mds.cephfs.compute-0.ujokui{-1:14544} state up:standby seq 2 join_fscid=1 addr [v2:192.168.122.100:6806/2465826838,v1:192.168.122.100:6807/2465826838] compat {c=[1],r=[1],i=[1fff]}]
                                           [mds.cephfs.compute-1.fpvjgb{-1:24215} state up:standby seq 1 addr [v2:192.168.122.101:6804/2619956440,v1:192.168.122.101:6805/2619956440] compat {c=[1],r=[1],i=[1fff]}]
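
The print_map dump above is the human-readable fsmap; the same fields (epoch, max_mds, the active daemon, the standbys) are available as JSON, which is easier to monitor programmatically. A minimal sketch, with the JSON key names being my assumption about this release's dump format:

    import json
    import subprocess

    # Fetch the fsmap the monitor printed above, as structured JSON.
    raw = subprocess.run(
        ["ceph", "fs", "dump", "--format", "json"],
        capture_output=True, text=True, check=True,
    ).stdout
    fsmap = json.loads(raw)
    for fs in fsmap.get("filesystems", []):
        m = fs["mdsmap"]
        print(m["fs_name"], "epoch", m["epoch"], "max_mds", m["max_mds"],
              "up", m.get("up"), "in", m.get("in"))
    print("standbys:", [s.get("name") for s in fsmap.get("standbys", [])])
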
Dec 06 09:42:03 compute-1 podman[84507]: 2025-12-06 09:42:03.000762058 +0000 UTC m=+0.057955535 container create 2b1801986393e8e2cbe7b4cdadc22f24012f42b9768a29cb7ee64c55eabe33b4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, OSD_FLAVOR=default, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 09:42:03 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fd7e7e9e4ddac3e74b3b7bc6b20dd5bb2fcc490030f679e68f53a0a8ada38ac6/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec 06 09:42:03 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fd7e7e9e4ddac3e74b3b7bc6b20dd5bb2fcc490030f679e68f53a0a8ada38ac6/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 06 09:42:03 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fd7e7e9e4ddac3e74b3b7bc6b20dd5bb2fcc490030f679e68f53a0a8ada38ac6/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec 06 09:42:03 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fd7e7e9e4ddac3e74b3b7bc6b20dd5bb2fcc490030f679e68f53a0a8ada38ac6/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.djsnbu-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec 06 09:42:03 compute-1 podman[84507]: 2025-12-06 09:42:02.977326091 +0000 UTC m=+0.034519568 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 06 09:42:03 compute-1 podman[84507]: 2025-12-06 09:42:03.072685202 +0000 UTC m=+0.129878719 container init 2b1801986393e8e2cbe7b4cdadc22f24012f42b9768a29cb7ee64c55eabe33b4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec 06 09:42:03 compute-1 podman[84507]: 2025-12-06 09:42:03.08177002 +0000 UTC m=+0.138963497 container start 2b1801986393e8e2cbe7b4cdadc22f24012f42b9768a29cb7ee64c55eabe33b4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec 06 09:42:03 compute-1 bash[84507]: 2b1801986393e8e2cbe7b4cdadc22f24012f42b9768a29cb7ee64c55eabe33b4
Dec 06 09:42:03 compute-1 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.djsnbu for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec 06 09:42:03 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:03 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec 06 09:42:03 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:03 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec 06 09:42:03 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:03 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec 06 09:42:03 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:03 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec 06 09:42:03 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:03 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec 06 09:42:03 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:03 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec 06 09:42:03 compute-1 sudo[84286]: pam_unix(sudo:session): session closed for user root
Dec 06 09:42:03 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:03 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec 06 09:42:03 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:03 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 09:42:03 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:03 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[main] rados_kv_traverse :CLIENT ID :EVENT :Failed to lst kv ret=-2
Dec 06 09:42:03 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:03 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[main] rados_cluster_read_clids :CLIENT ID :EVENT :Failed to traverse recovery db: -2
Dec 06 09:42:03 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:03 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 09:42:03 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:03 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
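
The rados_kv_traverse failure above is ret=-2 (ENOENT) and is expected on a first start: ganesha's rados_cluster recovery backend looks for per-node rec-* objects in the .nfs pool, finds none, concludes there are no clients to reclaim, and lifts grace early. A minimal sketch of listing those recovery and grace objects (names per this log; admin access assumed):

    import subprocess

    # On a fresh deployment this typically shows "grace" and the
    # "conf-nfs.cephfs" config object but no "rec-*" objects yet,
    # matching the ENOENT (-2) traversal failure above.
    objs = subprocess.run(
        ["rados", "-p", ".nfs", "-N", "cephfs", "ls"],
        capture_output=True, text=True, check=True,
    ).stdout
    print(objs)
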
Dec 06 09:42:03 compute-1 ceph-mon[79770]: mds.? [v2:192.168.122.100:6806/2465826838,v1:192.168.122.100:6807/2465826838] up:standby
Dec 06 09:42:03 compute-1 ceph-mon[79770]: fsmap cephfs:1 {0=cephfs.compute-2.czucwy=up:active} 2 up:standby
Dec 06 09:42:03 compute-1 ceph-mon[79770]: pgmap v33: 136 pgs: 136 active+clean; 454 KiB data, 81 MiB used, 60 GiB / 60 GiB avail; 1.8 KiB/s rd, 5.2 KiB/s wr, 20 op/s
Dec 06 09:42:03 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:42:03 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:42:03 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:42:03 compute-1 ceph-mon[79770]: Creating key for client.nfs.cephfs.1.0.compute-2.sseuqb
Dec 06 09:42:03 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.1.0.compute-2.sseuqb", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]: dispatch
Dec 06 09:42:03 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.1.0.compute-2.sseuqb", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]': finished
Dec 06 09:42:03 compute-1 ceph-mon[79770]: Ensuring nfs.cephfs.1 is in the ganesha grace table
Dec 06 09:42:03 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]: dispatch
Dec 06 09:42:03 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]': finished
Dec 06 09:42:03 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
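
Each ganesha daemon gets its own cephx identity, scoped to read the mon map and to read/write only the .nfs pool objects in this cluster's namespace; the audit lines above record the exact mgr-issued command. A hand-run equivalent, with the entity name and caps copied verbatim from the log:

    import subprocess

    # Same command the mgr dispatched above: create (or fetch) a key
    # limited to the .nfs pool, "cephfs" namespace.
    subprocess.run(
        ["ceph", "auth", "get-or-create",
         "client.nfs.cephfs.1.0.compute-2.sseuqb",
         "mon", "allow r",
         "osd", "allow rw pool=.nfs namespace=cephfs"],
        check=True,
    )
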
Dec 06 09:42:05 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:42:05 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).mds e10 new map
Dec 06 09:42:05 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).mds e10 print_map
                                           e10
                                           btime 2025-12-06T09:42:05.044345+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        8
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2025-12-06T09:41:29.967778+0000
                                           modified        2025-12-06T09:42:00.880325+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           max_mds        1
                                           in        0
                                           up        {0=24274}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        1
                                           qdb_cluster        leader: 24274 members: 24274
                                           [mds.cephfs.compute-2.czucwy{0:24274} state up:active seq 3 join_fscid=1 addr [v2:192.168.122.102:6804/1500676117,v1:192.168.122.102:6805/1500676117] compat {c=[1],r=[1],i=[1fff]}]
                                            
                                            
                                           Standby daemons:
                                            
                                           [mds.cephfs.compute-0.ujokui{-1:14544} state up:standby seq 2 join_fscid=1 addr [v2:192.168.122.100:6806/2465826838,v1:192.168.122.100:6807/2465826838] compat {c=[1],r=[1],i=[1fff]}]
                                           [mds.cephfs.compute-1.fpvjgb{-1:24215} state up:standby seq 2 join_fscid=1 addr [v2:192.168.122.101:6804/2619956440,v1:192.168.122.101:6805/2619956440] compat {c=[1],r=[1],i=[1fff]}]
Dec 06 09:42:05 compute-1 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb Updating MDS map to version 10 from mon.2
Dec 06 09:42:06 compute-1 ceph-mon[79770]: pgmap v34: 136 pgs: 136 active+clean; 456 KiB data, 85 MiB used, 60 GiB / 60 GiB avail; 183 KiB/s rd, 8.0 KiB/s wr, 343 op/s
Dec 06 09:42:06 compute-1 ceph-mon[79770]: mds.? [v2:192.168.122.101:6804/2619956440,v1:192.168.122.101:6805/2619956440] up:standby
Dec 06 09:42:06 compute-1 ceph-mon[79770]: fsmap cephfs:1 {0=cephfs.compute-2.czucwy=up:active} 2 up:standby
Dec 06 09:42:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:06 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[main] rados_cluster_end_grace :CLIENT ID :EVENT :Failed to remove rec-0000000000000001:nfs.cephfs.0: -2
Dec 06 09:42:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:06 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec 06 09:42:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:06 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec 06 09:42:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:06 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec 06 09:42:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:06 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec 06 09:42:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:06 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec 06 09:42:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:06 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec 06 09:42:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:06 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec 06 09:42:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:06 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 06 09:42:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:06 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 06 09:42:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:06 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 06 09:42:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:06 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec 06 09:42:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:06 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 06 09:42:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:06 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec 06 09:42:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:06 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec 06 09:42:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:06 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec 06 09:42:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:06 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec 06 09:42:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:06 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec 06 09:42:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:06 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec 06 09:42:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:06 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec 06 09:42:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:06 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec 06 09:42:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:06 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec 06 09:42:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:06 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec 06 09:42:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:06 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec 06 09:42:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:06 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec 06 09:42:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:06 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec 06 09:42:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:06 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec 06 09:42:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:06 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
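
The "No export entries found in configuration file" warning above is normal for a just-deployed cluster: ganesha starts with only the cephadm-generated bootstrap config, and exports are added later through the mgr nfs module. A minimal sketch of creating and listing one (the flag spelling follows recent Ceph releases and the paths are illustrative, so treat the details as assumptions):

    import subprocess

    # Create a CephFS export on NFS cluster "cephfs"; ganesha picks it up
    # from the RADOS config objects without a restart.
    subprocess.run(
        ["ceph", "nfs", "export", "create", "cephfs",
         "--cluster-id", "cephfs",
         "--pseudo-path", "/cephfs",
         "--fsname", "cephfs"],
        check=True,
    )
    print(subprocess.run(
        ["ceph", "nfs", "export", "ls", "cephfs"],
        capture_output=True, text=True, check=True,
    ).stdout)
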
Dec 06 09:42:07 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]: dispatch
Dec 06 09:42:07 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]': finished
Dec 06 09:42:07 compute-1 ceph-mon[79770]: Rados config object exists: conf-nfs.cephfs
Dec 06 09:42:07 compute-1 ceph-mon[79770]: Creating key for client.nfs.cephfs.1.0.compute-2.sseuqb-rgw
Dec 06 09:42:07 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.1.0.compute-2.sseuqb-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Dec 06 09:42:07 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.1.0.compute-2.sseuqb-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]': finished
Dec 06 09:42:07 compute-1 ceph-mon[79770]: Bind address in nfs.cephfs.1.0.compute-2.sseuqb's ganesha conf is defaulting to empty
Dec 06 09:42:07 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 09:42:07 compute-1 ceph-mon[79770]: Deploying daemon nfs.cephfs.1.0.compute-2.sseuqb on compute-2
Dec 06 09:42:07 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e51 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 09:42:08 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:08 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 09:42:08 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:08 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 09:42:08 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:08 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 09:42:08 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:08 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 09:42:08 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:08 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec 06 09:42:08 compute-1 ceph-mon[79770]: pgmap v35: 136 pgs: 136 active+clean; 456 KiB data, 85 MiB used, 60 GiB / 60 GiB avail; 161 KiB/s rd, 7.0 KiB/s wr, 301 op/s
Dec 06 09:42:08 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:42:08 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:42:08 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:42:08 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.2.0.compute-0.dfwxck", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]: dispatch
Dec 06 09:42:08 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.2.0.compute-0.dfwxck", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]': finished
Dec 06 09:42:08 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]: dispatch
Dec 06 09:42:08 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]': finished
Dec 06 09:42:08 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 09:42:09 compute-1 ceph-mon[79770]: Creating key for client.nfs.cephfs.2.0.compute-0.dfwxck
Dec 06 09:42:09 compute-1 ceph-mon[79770]: Ensuring nfs.cephfs.2 is in the ganesha grace table
Dec 06 09:42:09 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]: dispatch
Dec 06 09:42:09 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]': finished
Dec 06 09:42:09 compute-1 ceph-mon[79770]: Rados config object exists: conf-nfs.cephfs
Dec 06 09:42:09 compute-1 ceph-mon[79770]: Creating key for client.nfs.cephfs.2.0.compute-0.dfwxck-rgw
Dec 06 09:42:09 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.2.0.compute-0.dfwxck-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Dec 06 09:42:09 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.2.0.compute-0.dfwxck-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]': finished
Dec 06 09:42:09 compute-1 ceph-mon[79770]: Bind address in nfs.cephfs.2.0.compute-0.dfwxck's ganesha conf is defaulting to empty
Dec 06 09:42:09 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 09:42:09 compute-1 ceph-mon[79770]: Deploying daemon nfs.cephfs.2.0.compute-0.dfwxck on compute-0
Dec 06 09:42:09 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:42:10 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:10 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 09:42:10 compute-1 ceph-mon[79770]: pgmap v36: 136 pgs: 136 active+clean; 456 KiB data, 85 MiB used, 60 GiB / 60 GiB avail; 131 KiB/s rd, 5.7 KiB/s wr, 245 op/s
Dec 06 09:42:10 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:42:10 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:42:10 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:10 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 09:42:10 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:10 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 09:42:10 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:10 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 09:42:10 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:10 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec 06 09:42:10 compute-1 sudo[84575]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:42:10 compute-1 sudo[84575]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:42:10 compute-1 sudo[84575]: pam_unix(sudo:session): session closed for user root
Dec 06 09:42:10 compute-1 sudo[84600]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/haproxy:2.3 --timeout 895 _orch deploy --fsid 5ecd3f74-dade-5fc4-92ce-8950ae424258
Dec 06 09:42:10 compute-1 sudo[84600]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:42:11 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:42:11 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:42:11 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:42:11 compute-1 ceph-mon[79770]: Deploying daemon haproxy.nfs.cephfs.compute-1.jmdafd on compute-1
Dec 06 09:42:12 compute-1 ceph-mon[79770]: pgmap v37: 136 pgs: 136 active+clean; 456 KiB data, 85 MiB used, 60 GiB / 60 GiB avail; 120 KiB/s rd, 6.0 KiB/s wr, 224 op/s
Dec 06 09:42:12 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e51 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 09:42:15 compute-1 ceph-mon[79770]: pgmap v38: 136 pgs: 136 active+clean; 456 KiB data, 85 MiB used, 60 GiB / 60 GiB avail; 110 KiB/s rd, 3.0 KiB/s wr, 197 op/s
Dec 06 09:42:15 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:42:16 compute-1 ceph-mon[79770]: pgmap v39: 136 pgs: 136 active+clean; 456 KiB data, 85 MiB used, 60 GiB / 60 GiB avail; 113 KiB/s rd, 4.0 KiB/s wr, 201 op/s
Dec 06 09:42:16 compute-1 podman[84668]: 2025-12-06 09:42:16.067994466 +0000 UTC m=+4.788015484 container create 18e65341a7952f39d21229343d78490f489e269c94cfb809cd6cb5b054a35b60 (image=quay.io/ceph/haproxy:2.3, name=blissful_edison)
Dec 06 09:42:16 compute-1 podman[84668]: 2025-12-06 09:42:16.041916171 +0000 UTC m=+4.761937229 image pull e85424b0d443f37ddd2dd8a3bb2ef6f18dd352b987723a921b64289023af2914 quay.io/ceph/haproxy:2.3
Dec 06 09:42:16 compute-1 systemd[1]: Started libpod-conmon-18e65341a7952f39d21229343d78490f489e269c94cfb809cd6cb5b054a35b60.scope.
Dec 06 09:42:16 compute-1 systemd[1]: Started libcrun container.
Dec 06 09:42:16 compute-1 podman[84668]: 2025-12-06 09:42:16.165071931 +0000 UTC m=+4.885092939 container init 18e65341a7952f39d21229343d78490f489e269c94cfb809cd6cb5b054a35b60 (image=quay.io/ceph/haproxy:2.3, name=blissful_edison)
Dec 06 09:42:16 compute-1 podman[84668]: 2025-12-06 09:42:16.174844625 +0000 UTC m=+4.894865633 container start 18e65341a7952f39d21229343d78490f489e269c94cfb809cd6cb5b054a35b60 (image=quay.io/ceph/haproxy:2.3, name=blissful_edison)
Dec 06 09:42:16 compute-1 podman[84668]: 2025-12-06 09:42:16.178564464 +0000 UTC m=+4.898585472 container attach 18e65341a7952f39d21229343d78490f489e269c94cfb809cd6cb5b054a35b60 (image=quay.io/ceph/haproxy:2.3, name=blissful_edison)
Dec 06 09:42:16 compute-1 blissful_edison[84789]: 0 0
Dec 06 09:42:16 compute-1 systemd[1]: libpod-18e65341a7952f39d21229343d78490f489e269c94cfb809cd6cb5b054a35b60.scope: Deactivated successfully.
Dec 06 09:42:16 compute-1 podman[84668]: 2025-12-06 09:42:16.18422647 +0000 UTC m=+4.904247478 container died 18e65341a7952f39d21229343d78490f489e269c94cfb809cd6cb5b054a35b60 (image=quay.io/ceph/haproxy:2.3, name=blissful_edison)
Dec 06 09:42:16 compute-1 systemd[1]: var-lib-containers-storage-overlay-ad28b025bb456f81cdb6f60f7826e74df3cc72a199b13e204b08a3e31a602e1e-merged.mount: Deactivated successfully.
Dec 06 09:42:16 compute-1 podman[84668]: 2025-12-06 09:42:16.233819868 +0000 UTC m=+4.953840876 container remove 18e65341a7952f39d21229343d78490f489e269c94cfb809cd6cb5b054a35b60 (image=quay.io/ceph/haproxy:2.3, name=blissful_edison)
Dec 06 09:42:16 compute-1 systemd[1]: libpod-conmon-18e65341a7952f39d21229343d78490f489e269c94cfb809cd6cb5b054a35b60.scope: Deactivated successfully.
Dec 06 09:42:16 compute-1 systemd[1]: Reloading.
Dec 06 09:42:16 compute-1 systemd-rc-local-generator[84840]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:42:16 compute-1 systemd-sysv-generator[84843]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:42:16 compute-1 systemd[1]: Reloading.
Dec 06 09:42:16 compute-1 systemd-sysv-generator[84884]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:42:16 compute-1 systemd-rc-local-generator[84880]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:42:16 compute-1 systemd[1]: Starting Ceph haproxy.nfs.cephfs.compute-1.jmdafd for 5ecd3f74-dade-5fc4-92ce-8950ae424258...
Dec 06 09:42:17 compute-1 podman[84935]: 2025-12-06 09:42:17.1299086 +0000 UTC m=+0.042023998 container create 70891cd2190622057f9c45299e27938f7b2105f0244eda3658dedfb18fed50f0 (image=quay.io/ceph/haproxy:2.3, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd)
Dec 06 09:42:17 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2e32abe081b2708d90d7e7598309a913b00ba5c2a87ffd8d8b498ae51bb15565/merged/var/lib/haproxy supports timestamps until 2038 (0x7fffffff)
Dec 06 09:42:17 compute-1 podman[84935]: 2025-12-06 09:42:17.20088771 +0000 UTC m=+0.113003198 container init 70891cd2190622057f9c45299e27938f7b2105f0244eda3658dedfb18fed50f0 (image=quay.io/ceph/haproxy:2.3, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd)
Dec 06 09:42:17 compute-1 podman[84935]: 2025-12-06 09:42:17.111049128 +0000 UTC m=+0.023164546 image pull e85424b0d443f37ddd2dd8a3bb2ef6f18dd352b987723a921b64289023af2914 quay.io/ceph/haproxy:2.3
Dec 06 09:42:17 compute-1 podman[84935]: 2025-12-06 09:42:17.210452989 +0000 UTC m=+0.122568417 container start 70891cd2190622057f9c45299e27938f7b2105f0244eda3658dedfb18fed50f0 (image=quay.io/ceph/haproxy:2.3, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd)
Dec 06 09:42:17 compute-1 bash[84935]: 70891cd2190622057f9c45299e27938f7b2105f0244eda3658dedfb18fed50f0
Dec 06 09:42:17 compute-1 systemd[1]: Started Ceph haproxy.nfs.cephfs.compute-1.jmdafd for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec 06 09:42:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [NOTICE] 339/094217 (2) : New worker #1 (4) forked
Dec 06 09:42:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:17 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb24000df0 fd 37 proxy header rest len failed header rlen = % (will set dead)
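
These svc_vc_recv errors begin the moment haproxy starts and then repeat every second or two, which points at health-check probes: ganesha behind this ingress expects each frontend connection to open with a HAProxy PROXY-protocol preamble, and a probe that sends nothing (or raw RPC bytes) fails the header parse and is marked dead. For reference, a minimal sketch of what a well-formed PROXY v1 preamble looks like on the wire (addresses and ports are illustrative):

    import socket

    # PROXY protocol v1: a single human-readable, CRLF-terminated line sent
    # before any application bytes. Connections whose first bytes are not a
    # valid preamble are dropped, which is what the probes above hit.
    preamble = b"PROXY TCP4 192.168.122.1 192.168.122.2 56324 2049\r\n"
    with socket.create_connection(("192.168.122.2", 2049), timeout=5) as s:
        s.sendall(preamble)  # NFS RPC traffic would follow here
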
Dec 06 09:42:17 compute-1 sudo[84600]: pam_unix(sudo:session): session closed for user root
Dec 06 09:42:17 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e51 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 09:42:18 compute-1 ceph-mon[79770]: pgmap v40: 136 pgs: 136 active+clean; 456 KiB data, 85 MiB used, 60 GiB / 60 GiB avail; 5.1 KiB/s rd, 1.8 KiB/s wr, 7 op/s
Dec 06 09:42:18 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:42:18 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:42:18 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:42:18 compute-1 ceph-mon[79770]: Deploying daemon haproxy.nfs.cephfs.compute-0.fzuvue on compute-0
Dec 06 09:42:19 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:19 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb18001c00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:42:20 compute-1 ceph-mon[79770]: pgmap v41: 136 pgs: 136 active+clean; 456 KiB data, 85 MiB used, 60 GiB / 60 GiB avail; 5.1 KiB/s rd, 1.8 KiB/s wr, 7 op/s
Dec 06 09:42:21 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:21 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feafc000b60 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:42:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:22 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb20001110 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:42:22 compute-1 ceph-mon[79770]: pgmap v42: 136 pgs: 136 active+clean; 456 KiB data, 85 MiB used, 60 GiB / 60 GiB avail; 5.1 KiB/s rd, 1.8 KiB/s wr, 7 op/s
Dec 06 09:42:22 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:42:22 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:42:22 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e51 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 09:42:23 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:23 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb00000fa0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:42:23 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:42:23 compute-1 ceph-mon[79770]: Deploying daemon haproxy.nfs.cephfs.compute-2.voodna on compute-2
Dec 06 09:42:23 compute-1 ceph-mon[79770]: pgmap v43: 136 pgs: 136 active+clean; 456 KiB data, 85 MiB used, 60 GiB / 60 GiB avail; 2.8 KiB/s rd, 1023 B/s wr, 4 op/s
Dec 06 09:42:24 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:24 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb18002520 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:42:25 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:25 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feafc0016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:42:26 compute-1 ceph-mon[79770]: pgmap v44: 136 pgs: 136 active+clean; 456 KiB data, 85 MiB used, 60 GiB / 60 GiB avail; 3.1 KiB/s rd, 1023 B/s wr, 4 op/s
Dec 06 09:42:26 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:26 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb20001eb0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:42:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:27 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb00001ac0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:42:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:27 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feafc0016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:42:27 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e51 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 09:42:28 compute-1 sudo[84968]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:42:28 compute-1 sudo[84968]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:42:28 compute-1 sudo[84968]: pam_unix(sudo:session): session closed for user root
Dec 06 09:42:28 compute-1 sudo[84993]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/keepalived:2.2.4 --timeout 895 _orch deploy --fsid 5ecd3f74-dade-5fc4-92ce-8950ae424258
Dec 06 09:42:28 compute-1 sudo[84993]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:42:28 compute-1 ceph-mon[79770]: pgmap v45: 136 pgs: 136 active+clean; 456 KiB data, 85 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:42:28 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:42:28 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:42:28 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:42:28 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:42:28 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:28 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb18002520 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:42:29 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:29 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb20001eb0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:42:29 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e52 e52: 3 total, 3 up, 3 in
Dec 06 09:42:29 compute-1 ceph-mon[79770]: 192.168.122.2 is in 192.168.122.0/24 on compute-1 interface br-ex
Dec 06 09:42:29 compute-1 ceph-mon[79770]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Dec 06 09:42:29 compute-1 ceph-mon[79770]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Dec 06 09:42:29 compute-1 ceph-mon[79770]: Deploying daemon keepalived.nfs.cephfs.compute-1.uzbtlt on compute-1
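
Before deploying keepalived, the mgr checks which interface on each host carries the subnet of the configured virtual IP, which is what the three "192.168.122.2 is in 192.168.122.0/24 ... interface br-ex" lines show. That VIP comes from an ingress service spec; a minimal sketch of one consistent with this log (field names follow cephadm's ingress spec, while the frontend and monitor ports are assumptions):

    import json
    import subprocess
    import tempfile

    # Sketch of an ingress spec like the one driving the haproxy/keepalived
    # deployments above. JSON is valid YAML, so `ceph orch apply -i` takes it.
    spec = {
        "service_type": "ingress",
        "service_id": "nfs.cephfs",
        "placement": {"hosts": ["compute-0", "compute-1", "compute-2"]},
        "spec": {
            "backend_service": "nfs.cephfs",
            "virtual_ip": "192.168.122.2/24",  # matched to br-ex above
            "frontend_port": 2049,             # assumption
            "monitor_port": 9049,              # assumption
        },
    }
    with tempfile.NamedTemporaryFile("w", suffix=".yaml", delete=False) as f:
        json.dump(spec, f)
        path = f.name
    subprocess.run(["ceph", "orch", "apply", "-i", path], check=True)
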
Dec 06 09:42:29 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num", "val": "16"}]: dispatch
Dec 06 09:42:29 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:29 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb00001ac0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:42:30 compute-1 ceph-mon[79770]: pgmap v46: 136 pgs: 136 active+clean; 456 KiB data, 85 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:42:30 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num", "val": "16"}]': finished
Dec 06 09:42:30 compute-1 ceph-mon[79770]: osdmap e52: 3 total, 3 up, 3 in
Dec 06 09:42:30 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num", "val": "32"}]: dispatch
Dec 06 09:42:30 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num", "val": "32"}]': finished
Dec 06 09:42:30 compute-1 ceph-mon[79770]: osdmap e53: 3 total, 3 up, 3 in
Dec 06 09:42:30 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e53 e53: 3 total, 3 up, 3 in
Dec 06 09:42:30 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:30 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feafc0016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:42:31 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:31 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb18002520 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:42:31 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e54 e54: 3 total, 3 up, 3 in
Dec 06 09:42:31 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 54 pg[6.0( v 50'39 (0'0,50'39] local-lis/les=21/23 n=22 ec=21/21 lis/c=21/21 les/c/f=23/23/0 sis=54 pruub=13.692863464s) [0] r=0 lpr=54 pi=[21,54)/1 crt=50'39 lcod 48'38 mlcod 48'38 active pruub 189.167221069s@ mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [0], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:31 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 54 pg[6.0( v 50'39 lc 0'0 (0'0,50'39] local-lis/les=21/23 n=1 ec=21/21 lis/c=21/21 les/c/f=23/23/0 sis=54 pruub=13.692863464s) [0] r=0 lpr=54 pi=[21,54)/1 crt=50'39 lcod 48'38 mlcod 0'0 unknown pruub 189.167221069s@ mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:31 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:31 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb20002bc0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:42:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:32 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb00001ac0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:42:32 compute-1 podman[85059]: 2025-12-06 09:42:32.714945051 +0000 UTC m=+4.170299738 image pull 4a3a1ff181d97c6dcfa9138ad76eb99fa2c1e840298461d5a7a56133bc05b9a2 quay.io/ceph/keepalived:2.2.4
Dec 06 09:42:32 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e55 e55: 3 total, 3 up, 3 in
Dec 06 09:42:32 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num", "val": "32"}]: dispatch
Dec 06 09:42:32 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec 06 09:42:32 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "16"}]: dispatch
Dec 06 09:42:32 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num", "val": "32"}]': finished
Dec 06 09:42:32 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]': finished
Dec 06 09:42:32 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "16"}]': finished
Dec 06 09:42:32 compute-1 ceph-mon[79770]: osdmap e54: 3 total, 3 up, 3 in
Dec 06 09:42:32 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num", "val": "32"}]: dispatch
Dec 06 09:42:32 compute-1 podman[85059]: 2025-12-06 09:42:32.73659834 +0000 UTC m=+4.191952997 container create c79cdff044c61ed5770be1cce73d6f5b86e7b013e2720dae646f052e887cf12c (image=quay.io/ceph/keepalived:2.2.4, name=modest_pike, description=keepalived for Ceph, io.openshift.expose-services=, build-date=2023-02-22T09:23:20, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=keepalived-container, name=keepalived, vcs-type=git, architecture=x86_64, summary=Provides keepalived on RHEL 9 for Ceph., io.buildah.version=1.28.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1793, version=2.2.4, io.openshift.tags=Ceph keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.k8s.display-name=Keepalived on RHEL 9)
Dec 06 09:42:32 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 55 pg[6.c( v 50'39 lc 0'0 (0'0,50'39] local-lis/les=21/23 n=1 ec=54/21 lis/c=21/21 les/c/f=23/23/0 sis=54) [0] r=0 lpr=54 pi=[21,54)/1 crt=50'39 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:32 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 55 pg[6.8( v 50'39 lc 0'0 (0'0,50'39] local-lis/les=21/23 n=1 ec=54/21 lis/c=21/21 les/c/f=23/23/0 sis=54) [0] r=0 lpr=54 pi=[21,54)/1 crt=50'39 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:32 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 55 pg[6.9( v 50'39 lc 0'0 (0'0,50'39] local-lis/les=21/23 n=1 ec=54/21 lis/c=21/21 les/c/f=23/23/0 sis=54) [0] r=0 lpr=54 pi=[21,54)/1 crt=50'39 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:32 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 55 pg[6.a( v 50'39 lc 0'0 (0'0,50'39] local-lis/les=21/23 n=1 ec=54/21 lis/c=21/21 les/c/f=23/23/0 sis=54) [0] r=0 lpr=54 pi=[21,54)/1 crt=50'39 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:32 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 55 pg[6.e( v 50'39 lc 0'0 (0'0,50'39] local-lis/les=21/23 n=1 ec=54/21 lis/c=21/21 les/c/f=23/23/0 sis=54) [0] r=0 lpr=54 pi=[21,54)/1 crt=50'39 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:32 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 55 pg[6.f( v 50'39 lc 0'0 (0'0,50'39] local-lis/les=21/23 n=1 ec=54/21 lis/c=21/21 les/c/f=23/23/0 sis=54) [0] r=0 lpr=54 pi=[21,54)/1 crt=50'39 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:32 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 55 pg[6.5( v 50'39 lc 0'0 (0'0,50'39] local-lis/les=21/23 n=2 ec=54/21 lis/c=21/21 les/c/f=23/23/0 sis=54) [0] r=0 lpr=54 pi=[21,54)/1 crt=50'39 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:32 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 55 pg[6.2( v 50'39 lc 0'0 (0'0,50'39] local-lis/les=21/23 n=2 ec=54/21 lis/c=21/21 les/c/f=23/23/0 sis=54) [0] r=0 lpr=54 pi=[21,54)/1 crt=50'39 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:32 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 55 pg[6.3( v 50'39 lc 0'0 (0'0,50'39] local-lis/les=21/23 n=2 ec=54/21 lis/c=21/21 les/c/f=23/23/0 sis=54) [0] r=0 lpr=54 pi=[21,54)/1 crt=50'39 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:32 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 55 pg[6.4( v 50'39 lc 0'0 (0'0,50'39] local-lis/les=21/23 n=2 ec=54/21 lis/c=21/21 les/c/f=23/23/0 sis=54) [0] r=0 lpr=54 pi=[21,54)/1 crt=50'39 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:32 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 55 pg[6.7( v 50'39 lc 0'0 (0'0,50'39] local-lis/les=21/23 n=1 ec=54/21 lis/c=21/21 les/c/f=23/23/0 sis=54) [0] r=0 lpr=54 pi=[21,54)/1 crt=50'39 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:32 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 55 pg[6.6( v 50'39 lc 0'0 (0'0,50'39] local-lis/les=21/23 n=2 ec=54/21 lis/c=21/21 les/c/f=23/23/0 sis=54) [0] r=0 lpr=54 pi=[21,54)/1 crt=50'39 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:32 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 55 pg[6.1( v 50'39 (0'0,50'39] local-lis/les=21/23 n=2 ec=54/21 lis/c=21/21 les/c/f=23/23/0 sis=54) [0] r=0 lpr=54 pi=[21,54)/1 crt=50'39 lcod 0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:32 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 55 pg[6.d( v 50'39 lc 0'0 (0'0,50'39] local-lis/les=21/23 n=1 ec=54/21 lis/c=21/21 les/c/f=23/23/0 sis=54) [0] r=0 lpr=54 pi=[21,54)/1 crt=50'39 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:32 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 55 pg[6.b( v 50'39 lc 0'0 (0'0,50'39] local-lis/les=21/23 n=1 ec=54/21 lis/c=21/21 les/c/f=23/23/0 sis=54) [0] r=0 lpr=54 pi=[21,54)/1 crt=50'39 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:32 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 55 pg[6.8( v 50'39 (0'0,50'39] local-lis/les=54/55 n=1 ec=54/21 lis/c=21/21 les/c/f=23/23/0 sis=54) [0] r=0 lpr=54 pi=[21,54)/1 crt=50'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:32 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 55 pg[6.c( v 50'39 (0'0,50'39] local-lis/les=54/55 n=1 ec=54/21 lis/c=21/21 les/c/f=23/23/0 sis=54) [0] r=0 lpr=54 pi=[21,54)/1 crt=50'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:32 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 55 pg[6.9( v 50'39 (0'0,50'39] local-lis/les=54/55 n=1 ec=54/21 lis/c=21/21 les/c/f=23/23/0 sis=54) [0] r=0 lpr=54 pi=[21,54)/1 crt=50'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:32 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 55 pg[6.e( v 50'39 (0'0,50'39] local-lis/les=54/55 n=1 ec=54/21 lis/c=21/21 les/c/f=23/23/0 sis=54) [0] r=0 lpr=54 pi=[21,54)/1 crt=50'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:32 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 55 pg[6.f( v 50'39 (0'0,50'39] local-lis/les=54/55 n=1 ec=54/21 lis/c=21/21 les/c/f=23/23/0 sis=54) [0] r=0 lpr=54 pi=[21,54)/1 crt=50'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:32 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 55 pg[6.2( v 50'39 (0'0,50'39] local-lis/les=54/55 n=2 ec=54/21 lis/c=21/21 les/c/f=23/23/0 sis=54) [0] r=0 lpr=54 pi=[21,54)/1 crt=50'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:32 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 55 pg[6.5( v 50'39 (0'0,50'39] local-lis/les=54/55 n=2 ec=54/21 lis/c=21/21 les/c/f=23/23/0 sis=54) [0] r=0 lpr=54 pi=[21,54)/1 crt=50'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:32 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 55 pg[6.0( v 50'39 (0'0,50'39] local-lis/les=54/55 n=1 ec=21/21 lis/c=21/21 les/c/f=23/23/0 sis=54) [0] r=0 lpr=54 pi=[21,54)/1 crt=50'39 lcod 48'38 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:32 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 55 pg[6.3( v 50'39 (0'0,50'39] local-lis/les=54/55 n=2 ec=54/21 lis/c=21/21 les/c/f=23/23/0 sis=54) [0] r=0 lpr=54 pi=[21,54)/1 crt=50'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:32 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 55 pg[6.4( v 50'39 (0'0,50'39] local-lis/les=54/55 n=2 ec=54/21 lis/c=21/21 les/c/f=23/23/0 sis=54) [0] r=0 lpr=54 pi=[21,54)/1 crt=50'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:32 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 55 pg[6.7( v 50'39 (0'0,50'39] local-lis/les=54/55 n=1 ec=54/21 lis/c=21/21 les/c/f=23/23/0 sis=54) [0] r=0 lpr=54 pi=[21,54)/1 crt=50'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:32 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 55 pg[6.6( v 50'39 (0'0,50'39] local-lis/les=54/55 n=2 ec=54/21 lis/c=21/21 les/c/f=23/23/0 sis=54) [0] r=0 lpr=54 pi=[21,54)/1 crt=50'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:32 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 55 pg[6.1( v 50'39 (0'0,50'39] local-lis/les=54/55 n=2 ec=54/21 lis/c=21/21 les/c/f=23/23/0 sis=54) [0] r=0 lpr=54 pi=[21,54)/1 crt=50'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:32 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 55 pg[6.d( v 50'39 (0'0,50'39] local-lis/les=54/55 n=1 ec=54/21 lis/c=21/21 les/c/f=23/23/0 sis=54) [0] r=0 lpr=54 pi=[21,54)/1 crt=50'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:32 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 55 pg[6.b( v 50'39 (0'0,50'39] local-lis/les=54/55 n=1 ec=54/21 lis/c=21/21 les/c/f=23/23/0 sis=54) [0] r=0 lpr=54 pi=[21,54)/1 crt=50'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:32 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 55 pg[6.a( v 50'39 (0'0,50'39] local-lis/les=54/55 n=1 ec=54/21 lis/c=21/21 les/c/f=23/23/0 sis=54) [0] r=0 lpr=54 pi=[21,54)/1 crt=50'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:32 compute-1 systemd[1]: Started libpod-conmon-c79cdff044c61ed5770be1cce73d6f5b86e7b013e2720dae646f052e887cf12c.scope.
Dec 06 09:42:32 compute-1 systemd[1]: Started libcrun container.
Dec 06 09:42:32 compute-1 podman[85059]: 2025-12-06 09:42:32.829632068 +0000 UTC m=+4.284986765 container init c79cdff044c61ed5770be1cce73d6f5b86e7b013e2720dae646f052e887cf12c (image=quay.io/ceph/keepalived:2.2.4, name=modest_pike, description=keepalived for Ceph, name=keepalived, build-date=2023-02-22T09:23:20, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, release=1793, com.redhat.component=keepalived-container, io.openshift.expose-services=, io.k8s.display-name=Keepalived on RHEL 9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.28.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=Ceph keepalived, summary=Provides keepalived on RHEL 9 for Ceph., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, distribution-scope=public, version=2.2.4)
Dec 06 09:42:32 compute-1 podman[85059]: 2025-12-06 09:42:32.839327551 +0000 UTC m=+4.294682208 container start c79cdff044c61ed5770be1cce73d6f5b86e7b013e2720dae646f052e887cf12c (image=quay.io/ceph/keepalived:2.2.4, name=modest_pike, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, version=2.2.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Keepalived on RHEL 9, release=1793, io.buildah.version=1.28.2, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, architecture=x86_64, summary=Provides keepalived on RHEL 9 for Ceph., description=keepalived for Ceph, name=keepalived, io.openshift.expose-services=, io.openshift.tags=Ceph keepalived, build-date=2023-02-22T09:23:20, vendor=Red Hat, Inc., com.redhat.component=keepalived-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793)
Dec 06 09:42:32 compute-1 podman[85059]: 2025-12-06 09:42:32.843305636 +0000 UTC m=+4.298660313 container attach c79cdff044c61ed5770be1cce73d6f5b86e7b013e2720dae646f052e887cf12c (image=quay.io/ceph/keepalived:2.2.4, name=modest_pike, io.buildah.version=1.28.2, com.redhat.component=keepalived-container, io.openshift.expose-services=, io.k8s.display-name=Keepalived on RHEL 9, vcs-type=git, distribution-scope=public, description=keepalived for Ceph, name=keepalived, build-date=2023-02-22T09:23:20, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1793, vendor=Red Hat, Inc., summary=Provides keepalived on RHEL 9 for Ceph., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=Ceph keepalived, architecture=x86_64, version=2.2.4)
Dec 06 09:42:32 compute-1 modest_pike[85157]: 0 0
Dec 06 09:42:32 compute-1 systemd[1]: libpod-c79cdff044c61ed5770be1cce73d6f5b86e7b013e2720dae646f052e887cf12c.scope: Deactivated successfully.
Dec 06 09:42:32 compute-1 podman[85059]: 2025-12-06 09:42:32.849375501 +0000 UTC m=+4.304730158 container died c79cdff044c61ed5770be1cce73d6f5b86e7b013e2720dae646f052e887cf12c (image=quay.io/ceph/keepalived:2.2.4, name=modest_pike, io.openshift.expose-services=, io.openshift.tags=Ceph keepalived, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, version=2.2.4, io.k8s.display-name=Keepalived on RHEL 9, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, release=1793, summary=Provides keepalived on RHEL 9 for Ceph., io.buildah.version=1.28.2, name=keepalived, build-date=2023-02-22T09:23:20, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, description=keepalived for Ceph, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, com.redhat.component=keepalived-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 06 09:42:32 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e55 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 09:42:32 compute-1 systemd[1]: var-lib-containers-storage-overlay-d3df7fa136e824db67690e4f52b9f64af4b6569d1a91c442596b9d250f39f423-merged.mount: Deactivated successfully.
Dec 06 09:42:32 compute-1 podman[85059]: 2025-12-06 09:42:32.995666445 +0000 UTC m=+4.451021102 container remove c79cdff044c61ed5770be1cce73d6f5b86e7b013e2720dae646f052e887cf12c (image=quay.io/ceph/keepalived:2.2.4, name=modest_pike, vendor=Red Hat, Inc., summary=Provides keepalived on RHEL 9 for Ceph., vcs-type=git, io.buildah.version=1.28.2, description=keepalived for Ceph, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=Ceph keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, version=2.2.4, com.redhat.component=keepalived-container, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2023-02-22T09:23:20, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.k8s.display-name=Keepalived on RHEL 9, release=1793, architecture=x86_64, name=keepalived)
Dec 06 09:42:33 compute-1 systemd[1]: libpod-conmon-c79cdff044c61ed5770be1cce73d6f5b86e7b013e2720dae646f052e887cf12c.scope: Deactivated successfully.
Dec 06 09:42:33 compute-1 systemd[1]: Reloading.
Dec 06 09:42:33 compute-1 systemd-rc-local-generator[85204]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:42:33 compute-1 systemd-sysv-generator[85207]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:42:33 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:33 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feafc0016a0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:42:33 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 6.c scrub starts
Dec 06 09:42:33 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 6.c scrub ok
Dec 06 09:42:33 compute-1 systemd[1]: Reloading.
Dec 06 09:42:33 compute-1 systemd-sysv-generator[85249]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:42:33 compute-1 systemd-rc-local-generator[85245]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:42:33 compute-1 systemd[1]: Starting Ceph keepalived.nfs.cephfs.compute-1.uzbtlt for 5ecd3f74-dade-5fc4-92ce-8950ae424258...
Dec 06 09:42:33 compute-1 ceph-mon[79770]: pgmap v49: 136 pgs: 136 active+clean; 456 KiB data, 85 MiB used, 60 GiB / 60 GiB avail; 383 B/s rd, 0 op/s
Dec 06 09:42:33 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num", "val": "32"}]': finished
Dec 06 09:42:33 compute-1 ceph-mon[79770]: osdmap e55: 3 total, 3 up, 3 in
Dec 06 09:42:33 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num", "val": "32"}]: dispatch
Dec 06 09:42:33 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec 06 09:42:33 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec 06 09:42:33 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e56 e56: 3 total, 3 up, 3 in
Dec 06 09:42:33 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 56 pg[8.0( v 51'44 (0'0,51'44] local-lis/les=40/41 n=5 ec=40/40 lis/c=40/40 les/c/f=41/41/0 sis=56 pruub=12.352684975s) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 lcod 51'43 mlcod 51'43 active pruub 190.152496338s@ mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [0], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:33 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 56 pg[9.0( v 44'12 (0'0,44'12] local-lis/les=43/44 n=6 ec=43/43 lis/c=43/43 les/c/f=44/44/0 sis=56 pruub=14.953887939s) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 lcod 44'11 mlcod 44'11 active pruub 192.753723145s@ mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [0], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:33 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 56 pg[9.0( v 44'12 lc 0'0 (0'0,44'12] local-lis/les=43/44 n=0 ec=43/43 lis/c=43/43 les/c/f=44/44/0 sis=56 pruub=14.953887939s) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 lcod 44'11 mlcod 0'0 unknown pruub 192.753723145s@ mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:33 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 56 pg[8.0( v 51'44 lc 0'0 (0'0,51'44] local-lis/les=40/41 n=0 ec=40/40 lis/c=40/40 les/c/f=41/41/0 sis=56 pruub=12.352684975s) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 lcod 51'43 mlcod 0'0 unknown pruub 190.152496338s@ mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:33 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0).collection(8.0_head 0x55fb252cc6c0) operator()   moving buffer(0x55fb24a7f7e8 space 0x55fb24ac2760 0x0~1000 clean)
Dec 06 09:42:33 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:33 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb18002520 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:42:34 compute-1 podman[85303]: 2025-12-06 09:42:34.019204561 +0000 UTC m=+0.048339699 container create c8ec7212805c01399bc295ce2c5e69b11fbde393e887859b5ab336e81cd6d1f1 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-1-uzbtlt, version=2.2.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=keepalived for Ceph, architecture=x86_64, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, name=keepalived, release=1793, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., io.k8s.display-name=Keepalived on RHEL 9, io.buildah.version=1.28.2, vcs-type=git, build-date=2023-02-22T09:23:20, summary=Provides keepalived on RHEL 9 for Ceph., com.redhat.component=keepalived-container, io.openshift.tags=Ceph keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, distribution-scope=public)
Dec 06 09:42:34 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5954e0c88d233d83f2fcfd99401ef442e34ef24b6527071b725e5489ae056436/merged/etc/keepalived/keepalived.conf supports timestamps until 2038 (0x7fffffff)
Dec 06 09:42:34 compute-1 podman[85303]: 2025-12-06 09:42:34.080313145 +0000 UTC m=+0.109448313 container init c8ec7212805c01399bc295ce2c5e69b11fbde393e887859b5ab336e81cd6d1f1 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-1-uzbtlt, name=keepalived, vendor=Red Hat, Inc., vcs-type=git, description=keepalived for Ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides keepalived on RHEL 9 for Ceph., com.redhat.component=keepalived-container, distribution-scope=public, release=1793, io.k8s.display-name=Keepalived on RHEL 9, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, build-date=2023-02-22T09:23:20, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.openshift.expose-services=, architecture=x86_64, io.buildah.version=1.28.2, version=2.2.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=Ceph keepalived)
Dec 06 09:42:34 compute-1 podman[85303]: 2025-12-06 09:42:34.087071147 +0000 UTC m=+0.116206285 container start c8ec7212805c01399bc295ce2c5e69b11fbde393e887859b5ab336e81cd6d1f1 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-1-uzbtlt, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, description=keepalived for Ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2023-02-22T09:23:20, release=1793, io.buildah.version=1.28.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, com.redhat.component=keepalived-container, name=keepalived, vendor=Red Hat, Inc., version=2.2.4, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vcs-type=git, io.openshift.tags=Ceph keepalived, summary=Provides keepalived on RHEL 9 for Ceph., distribution-scope=public, io.k8s.display-name=Keepalived on RHEL 9, architecture=x86_64)
Dec 06 09:42:34 compute-1 bash[85303]: c8ec7212805c01399bc295ce2c5e69b11fbde393e887859b5ab336e81cd6d1f1
Dec 06 09:42:34 compute-1 podman[85303]: 2025-12-06 09:42:34.001242331 +0000 UTC m=+0.030377489 image pull 4a3a1ff181d97c6dcfa9138ad76eb99fa2c1e840298461d5a7a56133bc05b9a2 quay.io/ceph/keepalived:2.2.4
Dec 06 09:42:34 compute-1 systemd[1]: Started Ceph keepalived.nfs.cephfs.compute-1.uzbtlt for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec 06 09:42:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-1-uzbtlt[85318]: Sat Dec  6 09:42:34 2025: Starting Keepalived v2.2.4 (08/21,2021)
Dec 06 09:42:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-1-uzbtlt[85318]: Sat Dec  6 09:42:34 2025: Running on Linux 5.14.0-645.el9.x86_64 #1 SMP PREEMPT_DYNAMIC Fri Nov 28 14:01:17 UTC 2025 (built for Linux 5.14.0)
Dec 06 09:42:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-1-uzbtlt[85318]: Sat Dec  6 09:42:34 2025: Command line: '/usr/sbin/keepalived' '-n' '-l' '-f' '/etc/keepalived/keepalived.conf'
Dec 06 09:42:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-1-uzbtlt[85318]: Sat Dec  6 09:42:34 2025: Configuration file /etc/keepalived/keepalived.conf
Dec 06 09:42:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-1-uzbtlt[85318]: Sat Dec  6 09:42:34 2025: NOTICE: setting config option max_auto_priority should result in better keepalived performance
Dec 06 09:42:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-1-uzbtlt[85318]: Sat Dec  6 09:42:34 2025: Starting VRRP child process, pid=4
Dec 06 09:42:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-1-uzbtlt[85318]: Sat Dec  6 09:42:34 2025: Startup complete
Dec 06 09:42:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-1-uzbtlt[85318]: Sat Dec  6 09:42:34 2025: (VI_0) Entering BACKUP STATE (init)
Dec 06 09:42:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-1-uzbtlt[85318]: Sat Dec  6 09:42:34 2025: VRRP_Script(check_backend) succeeded
Dec 06 09:42:34 compute-1 sudo[84993]: pam_unix(sudo:session): session closed for user root
Dec 06 09:42:34 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 6.8 scrub starts
Dec 06 09:42:34 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 6.8 scrub ok
Dec 06 09:42:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:34 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb20002bc0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:42:34 compute-1 ceph-mon[79770]: 7.16 scrub starts
Dec 06 09:42:34 compute-1 ceph-mon[79770]: 7.16 scrub ok
Dec 06 09:42:34 compute-1 ceph-mon[79770]: pgmap v52: 182 pgs: 46 unknown, 136 active+clean; 456 KiB data, 85 MiB used, 60 GiB / 60 GiB avail
Dec 06 09:42:34 compute-1 ceph-mon[79770]: 6.c scrub starts
Dec 06 09:42:34 compute-1 ceph-mon[79770]: 6.c scrub ok
Dec 06 09:42:34 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num", "val": "32"}]': finished
Dec 06 09:42:34 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num_actual", "val": "32"}]': finished
Dec 06 09:42:34 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num_actual", "val": "32"}]': finished
Dec 06 09:42:34 compute-1 ceph-mon[79770]: osdmap e56: 3 total, 3 up, 3 in
Dec 06 09:42:34 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num", "val": "32"}]: dispatch
Dec 06 09:42:34 compute-1 ceph-mon[79770]: 7.c scrub starts
Dec 06 09:42:34 compute-1 ceph-mon[79770]: 7.c scrub ok
Dec 06 09:42:34 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:42:34 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:42:34 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:42:34 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e57 e57: 3 total, 3 up, 3 in
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.1e( v 44'12 lc 0'0 (0'0,44'12] local-lis/les=43/44 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.19( v 44'12 lc 0'0 (0'0,44'12] local-lis/les=43/44 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.18( v 51'44 lc 0'0 (0'0,51'44] local-lis/les=40/41 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.1f( v 51'44 lc 0'0 (0'0,51'44] local-lis/les=40/41 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.17( v 51'44 lc 0'0 (0'0,51'44] local-lis/les=40/41 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.16( v 44'12 lc 0'0 (0'0,44'12] local-lis/les=43/44 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.17( v 44'12 lc 0'0 (0'0,44'12] local-lis/les=43/44 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.16( v 51'44 lc 0'0 (0'0,51'44] local-lis/les=40/41 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.11( v 51'44 lc 0'0 (0'0,51'44] local-lis/les=40/41 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.3( v 44'12 lc 0'0 (0'0,44'12] local-lis/les=43/44 n=1 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.2( v 51'44 lc 0'0 (0'0,51'44] local-lis/les=40/41 n=1 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.4( v 44'12 lc 0'0 (0'0,44'12] local-lis/les=43/44 n=1 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.10( v 44'12 lc 0'0 (0'0,44'12] local-lis/les=43/44 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.7( v 44'12 lc 0'0 (0'0,44'12] local-lis/les=43/44 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.5( v 51'44 lc 0'0 (0'0,51'44] local-lis/les=40/41 n=1 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.6( v 51'44 lc 0'0 (0'0,51'44] local-lis/les=40/41 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.12( v 51'44 lc 0'0 (0'0,51'44] local-lis/les=40/41 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.13( v 44'12 lc 0'0 (0'0,44'12] local-lis/les=43/44 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.12( v 44'12 lc 0'0 (0'0,44'12] local-lis/les=43/44 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.1d( v 44'12 lc 0'0 (0'0,44'12] local-lis/les=43/44 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.1c( v 51'44 lc 0'0 (0'0,51'44] local-lis/les=40/41 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.1d( v 51'44 lc 0'0 (0'0,51'44] local-lis/les=40/41 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.13( v 51'44 lc 0'0 (0'0,51'44] local-lis/les=40/41 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.1c( v 44'12 lc 0'0 (0'0,44'12] local-lis/les=43/44 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.1f( v 44'12 lc 0'0 (0'0,44'12] local-lis/les=43/44 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.1e( v 51'44 lc 0'0 (0'0,51'44] local-lis/les=40/41 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.18( v 44'12 lc 0'0 (0'0,44'12] local-lis/les=43/44 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.19( v 51'44 lc 0'0 (0'0,51'44] local-lis/les=40/41 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.1a( v 51'44 lc 0'0 (0'0,51'44] local-lis/les=40/41 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.1a( v 44'12 lc 0'0 (0'0,44'12] local-lis/les=43/44 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.5( v 44'12 lc 0'0 (0'0,44'12] local-lis/les=43/44 n=1 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.4( v 51'44 lc 0'0 (0'0,51'44] local-lis/les=40/41 n=1 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.1b( v 51'44 lc 0'0 (0'0,51'44] local-lis/les=40/41 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.1b( v 44'12 lc 0'0 (0'0,44'12] local-lis/les=43/44 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.6( v 44'12 lc 0'0 (0'0,44'12] local-lis/les=43/44 n=1 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.7( v 51'44 lc 0'0 (0'0,51'44] local-lis/les=40/41 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.1( v 44'12 (0'0,44'12] local-lis/les=43/44 n=1 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 lcod 0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.1( v 51'44 (0'0,51'44] local-lis/les=40/41 n=1 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 lcod 0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.a( v 44'12 lc 0'0 (0'0,44'12] local-lis/les=43/44 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.b( v 51'44 lc 0'0 (0'0,51'44] local-lis/les=40/41 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.c( v 51'44 lc 0'0 (0'0,51'44] local-lis/les=40/41 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.d( v 44'12 lc 0'0 (0'0,44'12] local-lis/les=43/44 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.d( v 51'44 lc 0'0 (0'0,51'44] local-lis/les=40/41 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.e( v 51'44 lc 0'0 (0'0,51'44] local-lis/les=40/41 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.f( v 44'12 lc 0'0 (0'0,44'12] local-lis/les=43/44 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.a( v 51'44 lc 0'0 (0'0,51'44] local-lis/les=40/41 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.b( v 44'12 lc 0'0 (0'0,44'12] local-lis/les=43/44 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.9( v 51'44 lc 0'0 (0'0,51'44] local-lis/les=40/41 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.8( v 44'12 lc 0'0 (0'0,44'12] local-lis/les=43/44 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.9( v 44'12 lc 0'0 (0'0,44'12] local-lis/les=43/44 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.8( v 51'44 lc 0'0 (0'0,51'44] local-lis/les=40/41 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.c( v 44'12 lc 0'0 (0'0,44'12] local-lis/les=43/44 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.e( v 44'12 lc 0'0 (0'0,44'12] local-lis/les=43/44 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.2( v 44'12 lc 0'0 (0'0,44'12] local-lis/les=43/44 n=1 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.f( v 51'44 lc 0'0 (0'0,51'44] local-lis/les=40/41 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.3( v 51'44 lc 0'0 (0'0,51'44] local-lis/les=40/41 n=1 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.10( v 51'44 lc 0'0 (0'0,51'44] local-lis/les=40/41 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.11( v 44'12 lc 0'0 (0'0,44'12] local-lis/les=43/44 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.14( v 44'12 lc 0'0 (0'0,44'12] local-lis/les=43/44 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.15( v 51'44 lc 0'0 (0'0,51'44] local-lis/les=40/41 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.15( v 44'12 lc 0'0 (0'0,44'12] local-lis/les=43/44 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.14( v 51'44 lc 0'0 (0'0,51'44] local-lis/les=40/41 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.1e( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.19( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.16( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.17( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.16( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.17( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.18( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.3( v 44'12 (0'0,44'12] local-lis/les=56/57 n=1 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.1f( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.11( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.2( v 51'44 (0'0,51'44] local-lis/les=56/57 n=1 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.4( v 44'12 (0'0,44'12] local-lis/les=56/57 n=1 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.5( v 51'44 (0'0,51'44] local-lis/les=56/57 n=1 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.13( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.12( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.6( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.10( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.7( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.12( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.1c( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.1d( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.13( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.1d( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.1c( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.1f( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.1e( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.18( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.1a( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.4( v 51'44 (0'0,51'44] local-lis/les=56/57 n=1 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.1a( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.1b( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.5( v 44'12 (0'0,44'12] local-lis/les=56/57 n=1 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.19( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.6( v 44'12 (0'0,44'12] local-lis/les=56/57 n=1 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.7( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.1( v 44'12 (0'0,44'12] local-lis/les=56/57 n=1 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.0( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=40/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 lcod 51'43 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.1b( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.0( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=43/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 lcod 44'11 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.1( v 51'44 (0'0,51'44] local-lis/les=56/57 n=1 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.a( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.c( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.b( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.d( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.d( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.e( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.f( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.b( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.a( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.8( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.8( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.9( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.9( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.2( v 44'12 (0'0,44'12] local-lis/les=56/57 n=1 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.e( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.3( v 51'44 (0'0,51'44] local-lis/les=56/57 n=1 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.14( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.11( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.10( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.15( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.14( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.c( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[8.f( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=40/40 les/c/f=41/41/0 sis=56) [0] r=0 lpr=56 pi=[40,56)/1 crt=51'44 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:34 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 57 pg[9.15( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=43/43 les/c/f=44/44/0 sis=56) [0] r=0 lpr=56 pi=[43,56)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:35 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:35 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb00002f50 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:42:35 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 6.9 scrub starts
Dec 06 09:42:35 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 6.9 scrub ok
Dec 06 09:42:35 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e58 e58: 3 total, 3 up, 3 in
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[7.10( empty local-lis/les=0/0 n=0 ec=54/22 lis/c=54/54 les/c/f=55/55/0 sis=58) [0] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[7.18( empty local-lis/les=0/0 n=0 ec=54/22 lis/c=54/54 les/c/f=55/55/0 sis=58) [0] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[7.1e( empty local-lis/les=0/0 n=0 ec=54/22 lis/c=54/54 les/c/f=55/55/0 sis=58) [0] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[7.9( empty local-lis/les=0/0 n=0 ec=54/22 lis/c=54/54 les/c/f=55/55/0 sis=58) [0] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[7.13( empty local-lis/les=0/0 n=0 ec=54/22 lis/c=54/54 les/c/f=55/55/0 sis=58) [0] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[7.b( empty local-lis/les=0/0 n=0 ec=54/22 lis/c=54/54 les/c/f=55/55/0 sis=58) [0] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[7.8( empty local-lis/les=0/0 n=0 ec=54/22 lis/c=54/54 les/c/f=55/55/0 sis=58) [0] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[7.f( empty local-lis/les=0/0 n=0 ec=54/22 lis/c=54/54 les/c/f=55/55/0 sis=58) [0] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[7.e( empty local-lis/les=0/0 n=0 ec=54/22 lis/c=54/54 les/c/f=55/55/0 sis=58) [0] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[7.4( empty local-lis/les=0/0 n=0 ec=54/22 lis/c=54/54 les/c/f=55/55/0 sis=58) [0] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[7.3( empty local-lis/les=0/0 n=0 ec=54/22 lis/c=54/54 les/c/f=55/55/0 sis=58) [0] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[7.2( empty local-lis/les=0/0 n=0 ec=54/22 lis/c=54/54 les/c/f=55/55/0 sis=58) [0] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[7.6( empty local-lis/les=0/0 n=0 ec=54/22 lis/c=54/54 les/c/f=55/55/0 sis=58) [0] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[7.1b( empty local-lis/les=0/0 n=0 ec=54/22 lis/c=54/54 les/c/f=55/55/0 sis=58) [0] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.14( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.993197441s) [1] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 active pruub 194.830062866s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.14( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.993165016s) [1] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.830062866s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.10( v 57'45 (0'0,57'45] local-lis/les=56/57 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.992980957s) [1] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 51'44 mlcod 51'44 active pruub 194.830017090s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.10( v 57'45 (0'0,57'45] local-lis/les=56/57 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.992946625s) [1] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 51'44 mlcod 0'0 unknown NOTIFY pruub 194.830017090s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[9.11( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.992822647s) [1] r=-1 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 active pruub 194.830001831s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.15( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.992801666s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 active pruub 194.830017090s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[11.0( v 48'48 (0'0,48'48] local-lis/les=47/48 n=8 ec=47/47 lis/c=47/47 les/c/f=48/48/0 sis=58 pruub=9.013495445s) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 lcod 48'47 mlcod 48'47 active pruub 188.850738525s@ mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [0], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[9.11( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.992766380s) [1] r=-1 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.830001831s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[6.d( v 50'39 (0'0,50'39] local-lis/les=54/55 n=1 ec=54/21 lis/c=54/54 les/c/f=55/55/0 sis=58 pruub=12.961063385s) [2] r=-1 lpr=58 pi=[54,58)/1 crt=50'39 lcod 0'0 mlcod 0'0 active pruub 192.798477173s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.3( v 51'44 (0'0,51'44] local-lis/les=56/57 n=1 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.992439270s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 active pruub 194.829940796s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.15( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.992457390s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.830017090s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.3( v 51'44 (0'0,51'44] local-lis/les=56/57 n=1 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.992345810s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.829940796s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[6.1( v 50'39 (0'0,50'39] local-lis/les=54/55 n=2 ec=54/21 lis/c=54/54 les/c/f=55/55/0 sis=58 pruub=12.960508347s) [2] r=-1 lpr=58 pi=[54,58)/1 crt=50'39 lcod 0'0 mlcod 0'0 active pruub 192.798431396s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[6.1( v 50'39 (0'0,50'39] local-lis/les=54/55 n=2 ec=54/21 lis/c=54/54 les/c/f=55/55/0 sis=58 pruub=12.960480690s) [2] r=-1 lpr=58 pi=[54,58)/1 crt=50'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 192.798431396s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.f( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.991816521s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 active pruub 194.829956055s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.f( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.991789818s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.829956055s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[9.e( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.991558075s) [1] r=-1 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 active pruub 194.829940796s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.8( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.991323471s) [1] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 active pruub 194.829711914s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.8( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.991302490s) [1] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.829711914s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:35 compute-1 ceph-mon[79770]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[9.9( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.991175652s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 active pruub 194.829757690s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[9.9( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.991153717s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.829757690s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:35 compute-1 ceph-mon[79770]: 192.168.122.2 is in 192.168.122.0/24 on compute-1 interface br-ex
Dec 06 09:42:35 compute-1 ceph-mon[79770]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Dec 06 09:42:35 compute-1 ceph-mon[79770]: Deploying daemon keepalived.nfs.cephfs.compute-0.ylrrzf on compute-0
Dec 06 09:42:35 compute-1 ceph-mon[79770]: 6.8 scrub starts
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[6.7( v 50'39 (0'0,50'39] local-lis/les=54/55 n=1 ec=54/21 lis/c=54/54 les/c/f=55/55/0 sis=58 pruub=12.959497452s) [2] r=-1 lpr=58 pi=[54,58)/1 crt=50'39 lcod 0'0 mlcod 0'0 active pruub 192.798324585s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[6.7( v 50'39 (0'0,50'39] local-lis/les=54/55 n=1 ec=54/21 lis/c=54/54 les/c/f=55/55/0 sis=58 pruub=12.959472656s) [2] r=-1 lpr=58 pi=[54,58)/1 crt=50'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 192.798324585s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:35 compute-1 ceph-mon[79770]: 6.8 scrub ok
Dec 06 09:42:35 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num", "val": "32"}]': finished
Dec 06 09:42:35 compute-1 ceph-mon[79770]: osdmap e57: 3 total, 3 up, 3 in
Dec 06 09:42:35 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num", "val": "32"}]: dispatch
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.9( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.990754128s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 active pruub 194.829757690s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.9( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.990729332s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.829757690s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:35 compute-1 ceph-mon[79770]: 7.15 scrub starts
Dec 06 09:42:35 compute-1 ceph-mon[79770]: 7.15 scrub ok
Dec 06 09:42:35 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec 06 09:42:35 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[9.8( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.990464211s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 active pruub 194.829650879s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:35 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": ".nfs", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[9.8( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.990440369s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.829650879s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:35 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec 06 09:42:35 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec 06 09:42:35 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "2"}]: dispatch
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.a( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.990021706s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 active pruub 194.829620361s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.a( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.989954948s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.829620361s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[9.b( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.989780426s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 active pruub 194.829589844s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[9.b( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.989757538s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.829589844s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[9.f( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.989526749s) [1] r=-1 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 active pruub 194.829559326s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[9.f( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.989503860s) [1] r=-1 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.829559326s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[9.e( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.991530418s) [1] r=-1 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.829940796s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[6.3( v 50'39 (0'0,50'39] local-lis/les=54/55 n=2 ec=54/21 lis/c=54/54 les/c/f=55/55/0 sis=58 pruub=12.957810402s) [2] r=-1 lpr=58 pi=[54,58)/1 crt=50'39 lcod 0'0 mlcod 0'0 active pruub 192.798278809s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[6.3( v 50'39 (0'0,50'39] local-lis/les=54/55 n=2 ec=54/21 lis/c=54/54 les/c/f=55/55/0 sis=58 pruub=12.957788467s) [2] r=-1 lpr=58 pi=[54,58)/1 crt=50'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 192.798278809s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.d( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.988817215s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 active pruub 194.829467773s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.d( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.988798141s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.829467773s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[9.15( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.989374161s) [1] r=-1 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 active pruub 194.830276489s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.c( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.988478661s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 active pruub 194.829452515s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.c( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.988456726s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.829452515s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[9.15( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.989293098s) [1] r=-1 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.830276489s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[6.d( v 50'39 (0'0,50'39] local-lis/les=54/55 n=1 ec=54/21 lis/c=54/54 les/c/f=55/55/0 sis=58 pruub=12.960991859s) [2] r=-1 lpr=58 pi=[54,58)/1 crt=50'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 192.798477173s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[11.0( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=0 ec=47/47 lis/c=47/47 les/c/f=48/48/0 sis=58 pruub=9.013495445s) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 lcod 48'47 mlcod 0'0 unknown pruub 188.850738525s@ mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[6.5( v 50'39 (0'0,50'39] local-lis/les=54/55 n=2 ec=54/21 lis/c=54/54 les/c/f=55/55/0 sis=58 pruub=12.956918716s) [2] r=-1 lpr=58 pi=[54,58)/1 crt=50'39 lcod 0'0 mlcod 0'0 active pruub 192.798202515s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[6.5( v 50'39 (0'0,50'39] local-lis/les=54/55 n=2 ec=54/21 lis/c=54/54 les/c/f=55/55/0 sis=58 pruub=12.956892014s) [2] r=-1 lpr=58 pi=[54,58)/1 crt=50'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 192.798202515s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.b( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.988080025s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 active pruub 194.829467773s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[9.a( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.987852097s) [1] r=-1 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 active pruub 194.829345703s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[9.a( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.987829208s) [1] r=-1 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.829345703s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[9.d( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.987977028s) [1] r=-1 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 active pruub 194.829528809s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[9.d( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.987812996s) [1] r=-1 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.829528809s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.b( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.988037109s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.829467773s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[6.f( v 50'39 (0'0,50'39] local-lis/les=54/55 n=1 ec=54/21 lis/c=54/54 les/c/f=55/55/0 sis=58 pruub=12.956246376s) [2] r=-1 lpr=58 pi=[54,58)/1 crt=50'39 lcod 0'0 mlcod 0'0 active pruub 192.798141479s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[6.f( v 50'39 (0'0,50'39] local-lis/les=54/55 n=1 ec=54/21 lis/c=54/54 les/c/f=55/55/0 sis=58 pruub=12.956223488s) [2] r=-1 lpr=58 pi=[54,58)/1 crt=50'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 192.798141479s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[6.9( v 50'39 (0'0,50'39] local-lis/les=54/55 n=0 ec=54/21 lis/c=54/54 les/c/f=55/55/0 sis=58 pruub=12.955714226s) [2] r=-1 lpr=58 pi=[54,58)/1 crt=50'39 lcod 0'0 mlcod 0'0 active pruub 192.797897339s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[9.6( v 44'12 (0'0,44'12] local-lis/les=56/57 n=1 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.986533165s) [1] r=-1 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 active pruub 194.828842163s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[6.9( v 50'39 (0'0,50'39] local-lis/les=54/55 n=0 ec=54/21 lis/c=54/54 les/c/f=55/55/0 sis=58 pruub=12.955690384s) [2] r=-1 lpr=58 pi=[54,58)/1 crt=50'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 192.797897339s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[9.6( v 44'12 (0'0,44'12] local-lis/les=56/57 n=1 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.986492157s) [1] r=-1 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.828842163s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.4( v 51'44 (0'0,51'44] local-lis/les=56/57 n=1 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.986289024s) [1] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 active pruub 194.828735352s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.4( v 51'44 (0'0,51'44] local-lis/les=56/57 n=1 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.986240387s) [1] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.828735352s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.1b( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.986190796s) [1] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 active pruub 194.828781128s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.1b( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.986135483s) [1] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.828781128s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[9.5( v 44'12 (0'0,44'12] local-lis/les=56/57 n=1 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.985842705s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 active pruub 194.828796387s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.19( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.985840797s) [1] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 active pruub 194.828826904s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.19( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.985815048s) [1] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.828826904s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[9.18( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.979178429s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 active pruub 194.822219849s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[9.18( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.979141235s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.822219849s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.1c( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.978680611s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 active pruub 194.822006226s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.1c( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.978662491s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.822006226s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[9.1d( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.978631020s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 active pruub 194.822036743s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[9.1d( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.978603363s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.822036743s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[9.12( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.978453636s) [1] r=-1 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 active pruub 194.821975708s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[9.12( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.978426933s) [1] r=-1 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.821975708s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.12( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.978094101s) [1] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 active pruub 194.821838379s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.6( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.978119850s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 active pruub 194.821884155s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.6( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.978103638s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.821884155s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[9.13( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.978011131s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 active pruub 194.821823120s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.12( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.978046417s) [1] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.821838379s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[9.13( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.977977753s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.821823120s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[9.7( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.977997780s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 active pruub 194.821929932s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[9.7( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.977980614s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.821929932s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.5( v 51'44 (0'0,51'44] local-lis/les=56/57 n=1 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.977673531s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 active pruub 194.821838379s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.2( v 51'44 (0'0,51'44] local-lis/les=56/57 n=1 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.977571487s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 active pruub 194.821746826s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.2( v 51'44 (0'0,51'44] local-lis/les=56/57 n=1 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.977553368s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.821746826s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.5( v 51'44 (0'0,51'44] local-lis/les=56/57 n=1 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.977593422s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.821838379s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[9.3( v 44'12 (0'0,44'12] local-lis/les=56/57 n=1 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.975193024s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 active pruub 194.819534302s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[9.3( v 44'12 (0'0,44'12] local-lis/les=56/57 n=1 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.975163460s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.819534302s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[9.5( v 44'12 (0'0,44'12] local-lis/les=56/57 n=1 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.985816956s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.828796387s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.11( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.977272987s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 active pruub 194.821731567s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[9.10( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.977396965s) [1] r=-1 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 active pruub 194.821929932s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.11( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.977242470s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.821731567s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[9.10( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.977380753s) [1] r=-1 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.821929932s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.16( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.974814415s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 active pruub 194.819503784s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[9.17( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.974786758s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 active pruub 194.819503784s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.16( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.974771500s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.819503784s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[9.17( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.974769592s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.819503784s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.17( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.974672318s) [1] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 active pruub 194.819473267s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.17( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.974642754s) [1] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.819473267s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.18( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.974593163s) [1] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 active pruub 194.819519043s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.18( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.974575996s) [1] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.819519043s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.1f( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.974564552s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 active pruub 194.819549561s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[8.1f( v 51'44 (0'0,51'44] local-lis/les=56/57 n=0 ec=56/40 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.974538803s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=51'44 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.819549561s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[9.16( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.973692894s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 active pruub 194.819473267s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[9.16( v 44'12 (0'0,44'12] local-lis/les=56/57 n=0 ec=56/43 lis/c=56/56 les/c/f=57/57/0 sis=58 pruub=14.973628998s) [2] r=-1 lpr=58 pi=[56,58)/1 crt=44'12 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.819473267s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[6.b( v 50'39 (0'0,50'39] local-lis/les=54/55 n=1 ec=54/21 lis/c=54/54 les/c/f=55/55/0 sis=58 pruub=12.952226639s) [2] r=-1 lpr=58 pi=[54,58)/1 crt=50'39 lcod 0'0 mlcod 0'0 active pruub 192.798522949s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:35 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 58 pg[6.b( v 50'39 (0'0,50'39] local-lis/les=54/55 n=1 ec=54/21 lis/c=54/54 les/c/f=55/55/0 sis=58 pruub=12.952185631s) [2] r=-1 lpr=58 pi=[54,58)/1 crt=50'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 192.798522949s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:35 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:35 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb00002f50 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:42:36 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 9.14 scrub starts
Dec 06 09:42:36 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 9.14 scrub ok
Dec 06 09:42:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:36 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb18002520 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:42:36 compute-1 ceph-mon[79770]: pgmap v55: 244 pgs: 244 active+clean; 456 KiB data, 85 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 09:42:36 compute-1 ceph-mon[79770]: 6.9 scrub starts
Dec 06 09:42:36 compute-1 ceph-mon[79770]: 6.9 scrub ok
Dec 06 09:42:36 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num", "val": "32"}]': finished
Dec 06 09:42:36 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num_actual", "val": "32"}]': finished
Dec 06 09:42:36 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num_actual", "val": "32"}]': finished
Dec 06 09:42:36 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": ".nfs", "var": "pgp_num_actual", "val": "32"}]': finished
Dec 06 09:42:36 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pgp_num_actual", "val": "32"}]': finished
Dec 06 09:42:36 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pgp_num_actual", "val": "32"}]': finished
Dec 06 09:42:36 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "2"}]': finished
Dec 06 09:42:36 compute-1 ceph-mon[79770]: osdmap e58: 3 total, 3 up, 3 in
Dec 06 09:42:36 compute-1 ceph-mon[79770]: 7.1c scrub starts
Dec 06 09:42:36 compute-1 ceph-mon[79770]: 7.1c scrub ok
Dec 06 09:42:36 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e59 e59: 3 total, 3 up, 3 in
Dec 06 09:42:36 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.17( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:36 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.16( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:36 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.13( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:36 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.c( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:36 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.b( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:36 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.a( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:36 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.9( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:36 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.d( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:36 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.e( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:36 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.f( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:36 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.8( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=1 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:36 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.2( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=1 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:36 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.3( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=1 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:36 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.4( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=1 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:36 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.7( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=1 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:36 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.18( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:36 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.19( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:36 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.1a( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:36 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.1d( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:36 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.1e( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:36 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.1f( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:36 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.10( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:36 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.11( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:36 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.5( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=1 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:36 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.6( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=1 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:36 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.1( v 48'48 (0'0,48'48] local-lis/les=47/48 n=1 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:36 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.12( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:36 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.15( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:36 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.14( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:36 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.1b( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:36 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.1c( v 48'48 lc 0'0 (0'0,48'48] local-lis/les=47/48 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:36 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.17( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:36 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[7.1b( empty local-lis/les=58/59 n=0 ec=54/22 lis/c=54/54 les/c/f=55/55/0 sis=58) [0] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:36 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.13( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:36 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.0( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=47/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 lcod 48'47 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:36 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.c( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:36 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.16( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:36 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.b( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:36 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.a( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:36 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[7.6( empty local-lis/les=58/59 n=0 ec=54/22 lis/c=54/54 les/c/f=55/55/0 sis=58) [0] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:36 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.9( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:36 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.d( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:36 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[7.2( empty local-lis/les=58/59 n=0 ec=54/22 lis/c=54/54 les/c/f=55/55/0 sis=58) [0] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:36 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.e( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:36 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[7.3( empty local-lis/les=58/59 n=0 ec=54/22 lis/c=54/54 les/c/f=55/55/0 sis=58) [0] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:36 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.f( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:36 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[7.4( empty local-lis/les=58/59 n=0 ec=54/22 lis/c=54/54 les/c/f=55/55/0 sis=58) [0] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:36 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[7.e( empty local-lis/les=58/59 n=0 ec=54/22 lis/c=54/54 les/c/f=55/55/0 sis=58) [0] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:36 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[7.8( empty local-lis/les=58/59 n=0 ec=54/22 lis/c=54/54 les/c/f=55/55/0 sis=58) [0] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:36 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.2( v 48'48 (0'0,48'48] local-lis/les=58/59 n=1 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:36 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[7.f( empty local-lis/les=58/59 n=0 ec=54/22 lis/c=54/54 les/c/f=55/55/0 sis=58) [0] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:36 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.8( v 48'48 (0'0,48'48] local-lis/les=58/59 n=1 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:36 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.3( v 48'48 (0'0,48'48] local-lis/les=58/59 n=1 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:36 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.4( v 48'48 (0'0,48'48] local-lis/les=58/59 n=1 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:36 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[7.b( empty local-lis/les=58/59 n=0 ec=54/22 lis/c=54/54 les/c/f=55/55/0 sis=58) [0] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:36 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.18( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:36 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.7( v 48'48 (0'0,48'48] local-lis/les=58/59 n=1 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:36 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.19( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:36 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.1d( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:36 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.1a( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:36 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.1e( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:36 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.1f( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:36 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[7.13( empty local-lis/les=58/59 n=0 ec=54/22 lis/c=54/54 les/c/f=55/55/0 sis=58) [0] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:36 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.10( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:36 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[7.9( empty local-lis/les=58/59 n=0 ec=54/22 lis/c=54/54 les/c/f=55/55/0 sis=58) [0] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:36 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.11( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:36 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.5( v 48'48 (0'0,48'48] local-lis/les=58/59 n=1 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:36 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.6( v 48'48 (0'0,48'48] local-lis/les=58/59 n=1 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:36 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.1( v 48'48 (0'0,48'48] local-lis/les=58/59 n=1 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:36 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[7.1e( empty local-lis/les=58/59 n=0 ec=54/22 lis/c=54/54 les/c/f=55/55/0 sis=58) [0] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:36 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[7.18( empty local-lis/les=58/59 n=0 ec=54/22 lis/c=54/54 les/c/f=55/55/0 sis=58) [0] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:36 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.12( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:36 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.14( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:36 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[7.10( empty local-lis/les=58/59 n=0 ec=54/22 lis/c=54/54 les/c/f=55/55/0 sis=58) [0] r=0 lpr=58 pi=[54,58)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:36 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.1b( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:36 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.1c( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:36 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 59 pg[11.15( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=47/47 les/c/f=48/48/0 sis=58) [0] r=0 lpr=58 pi=[47,58)/1 crt=48'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:37 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb200038d0 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:42:37 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 9.2 scrub starts
Dec 06 09:42:37 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 9.2 scrub ok
Dec 06 09:42:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-1-uzbtlt[85318]: Sat Dec  6 09:42:37 2025: (VI_0) Entering MASTER STATE
Dec 06 09:42:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:37 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feafc002f00 fd 37 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:42:37 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e59 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 09:42:37 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e60 e60: 3 total, 3 up, 3 in
Dec 06 09:42:38 compute-1 ceph-mon[79770]: 9.14 scrub starts
Dec 06 09:42:38 compute-1 ceph-mon[79770]: 9.14 scrub ok
Dec 06 09:42:38 compute-1 ceph-mon[79770]: 7.12 scrub starts
Dec 06 09:42:38 compute-1 ceph-mon[79770]: 7.12 scrub ok
Dec 06 09:42:38 compute-1 ceph-mon[79770]: osdmap e59: 3 total, 3 up, 3 in
Dec 06 09:42:38 compute-1 ceph-mon[79770]: pgmap v58: 306 pgs: 62 unknown, 244 active+clean; 456 KiB data, 85 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 09:42:38 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num_actual", "val": "32"}]: dispatch
Dec 06 09:42:38 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 6.6 scrub starts
Dec 06 09:42:38 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 6.6 scrub ok
Dec 06 09:42:38 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:38 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb00002f50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:42:39 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e61 e61: 3 total, 3 up, 3 in
Dec 06 09:42:39 compute-1 ceph-mon[79770]: 9.2 scrub starts
Dec 06 09:42:39 compute-1 ceph-mon[79770]: 9.2 scrub ok
Dec 06 09:42:39 compute-1 ceph-mon[79770]: 7.17 deep-scrub starts
Dec 06 09:42:39 compute-1 ceph-mon[79770]: 7.17 deep-scrub ok
Dec 06 09:42:39 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num_actual", "val": "32"}]': finished
Dec 06 09:42:39 compute-1 ceph-mon[79770]: osdmap e60: 3 total, 3 up, 3 in
Dec 06 09:42:39 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:39 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb18002520 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:42:39 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 6.4 scrub starts
Dec 06 09:42:39 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 6.4 scrub ok
Dec 06 09:42:39 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:39 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb200038d0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:42:40 compute-1 ceph-mon[79770]: 6.6 scrub starts
Dec 06 09:42:40 compute-1 ceph-mon[79770]: 6.6 scrub ok
Dec 06 09:42:40 compute-1 ceph-mon[79770]: 7.0 deep-scrub starts
Dec 06 09:42:40 compute-1 ceph-mon[79770]: 7.0 deep-scrub ok
Dec 06 09:42:40 compute-1 ceph-mon[79770]: pgmap v60: 337 pgs: 93 unknown, 244 active+clean; 456 KiB data, 85 MiB used, 60 GiB / 60 GiB avail; 725 B/s rd, 0 op/s
Dec 06 09:42:40 compute-1 ceph-mon[79770]: osdmap e61: 3 total, 3 up, 3 in
Dec 06 09:42:40 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:42:40 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:42:40 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:42:40 compute-1 ceph-mon[79770]: 6.f scrub starts
Dec 06 09:42:40 compute-1 ceph-mon[79770]: 6.f scrub ok
Dec 06 09:42:40 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:42:40 compute-1 ceph-mon[79770]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Dec 06 09:42:40 compute-1 ceph-mon[79770]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Dec 06 09:42:40 compute-1 ceph-mon[79770]: 192.168.122.2 is in 192.168.122.0/24 on compute-1 interface br-ex
Dec 06 09:42:40 compute-1 ceph-mon[79770]: Deploying daemon keepalived.nfs.cephfs.compute-2.whsrlg on compute-2
Dec 06 09:42:40 compute-1 ceph-mon[79770]: 6.4 scrub starts
Dec 06 09:42:40 compute-1 ceph-mon[79770]: 6.4 scrub ok
Dec 06 09:42:40 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 6.0 scrub starts
Dec 06 09:42:40 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 6.0 scrub ok
Dec 06 09:42:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:40 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feafc002f00 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:42:41 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e62 e62: 3 total, 3 up, 3 in
Dec 06 09:42:41 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[12.1c( empty local-lis/les=0/0 n=0 ec=60/49 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:41 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[12.12( empty local-lis/les=0/0 n=0 ec=60/49 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:41 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[12.6( empty local-lis/les=0/0 n=0 ec=60/49 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:41 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[12.19( empty local-lis/les=0/0 n=0 ec=60/49 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:41 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[12.8( empty local-lis/les=0/0 n=0 ec=60/49 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:41 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[12.a( empty local-lis/les=0/0 n=0 ec=60/49 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:41 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[12.e( empty local-lis/les=0/0 n=0 ec=60/49 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:41 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[12.c( empty local-lis/les=0/0 n=0 ec=60/49 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:41 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[12.b( empty local-lis/les=0/0 n=0 ec=60/49 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:41 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[12.10( empty local-lis/les=0/0 n=0 ec=60/49 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:41 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[11.17( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=11.806502342s) [2] r=-1 lpr=62 pi=[58,62)/1 crt=48'48 lcod 0'0 mlcod 0'0 active pruub 197.025405884s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:41 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[11.17( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=11.806447029s) [2] r=-1 lpr=62 pi=[58,62)/1 crt=48'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 197.025405884s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:41 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[11.16( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=11.811082840s) [2] r=-1 lpr=62 pi=[58,62)/1 crt=48'48 lcod 0'0 mlcod 0'0 active pruub 197.030075073s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:41 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[11.16( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=11.811053276s) [2] r=-1 lpr=62 pi=[58,62)/1 crt=48'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 197.030075073s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:41 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[11.13( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=11.810568810s) [2] r=-1 lpr=62 pi=[58,62)/1 crt=48'48 lcod 0'0 mlcod 0'0 active pruub 197.030014038s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:41 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[11.13( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=11.810263634s) [2] r=-1 lpr=62 pi=[58,62)/1 crt=48'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 197.030014038s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:41 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[6.6( v 50'39 (0'0,50'39] local-lis/les=54/55 n=1 ec=54/21 lis/c=54/54 les/c/f=55/55/0 sis=62 pruub=15.578481674s) [1] r=-1 lpr=62 pi=[54,62)/1 crt=50'39 lcod 0'0 mlcod 0'0 active pruub 200.798706055s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:41 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[11.a( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=11.810064316s) [2] r=-1 lpr=62 pi=[58,62)/1 crt=48'48 lcod 0'0 mlcod 0'0 active pruub 197.030303955s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:41 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[6.6( v 50'39 (0'0,50'39] local-lis/les=54/55 n=1 ec=54/21 lis/c=54/54 les/c/f=55/55/0 sis=62 pruub=15.578426361s) [1] r=-1 lpr=62 pi=[54,62)/1 crt=50'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 200.798706055s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:41 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[11.a( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=11.810020447s) [2] r=-1 lpr=62 pi=[58,62)/1 crt=48'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 197.030303955s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:41 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[11.e( v 61'51 (0'0,61'51] local-lis/les=58/59 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=11.809412003s) [2] r=-1 lpr=62 pi=[58,62)/1 crt=59'49 lcod 59'50 mlcod 59'50 active pruub 197.030380249s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:41 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[11.f( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=11.809444427s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=48'48 lcod 0'0 mlcod 0'0 active pruub 197.030410767s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:41 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[6.2( v 50'39 (0'0,50'39] local-lis/les=54/55 n=2 ec=54/21 lis/c=54/54 les/c/f=55/55/0 sis=62 pruub=15.577353477s) [1] r=-1 lpr=62 pi=[54,62)/1 crt=50'39 lcod 0'0 mlcod 0'0 active pruub 200.798385620s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:41 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[11.f( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=11.809404373s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=48'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 197.030410767s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:41 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[11.e( v 61'51 (0'0,61'51] local-lis/les=58/59 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=11.809345245s) [2] r=-1 lpr=62 pi=[58,62)/1 crt=59'49 lcod 59'50 mlcod 0'0 unknown NOTIFY pruub 197.030380249s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:41 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[6.2( v 50'39 (0'0,50'39] local-lis/les=54/55 n=2 ec=54/21 lis/c=54/54 les/c/f=55/55/0 sis=62 pruub=15.577314377s) [1] r=-1 lpr=62 pi=[54,62)/1 crt=50'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 200.798385620s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:41 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[11.8( v 48'48 (0'0,48'48] local-lis/les=58/59 n=1 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=11.809409142s) [2] r=-1 lpr=62 pi=[58,62)/1 crt=48'48 lcod 0'0 mlcod 0'0 active pruub 197.030548096s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:41 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[11.8( v 48'48 (0'0,48'48] local-lis/les=58/59 n=1 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=11.809364319s) [2] r=-1 lpr=62 pi=[58,62)/1 crt=48'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 197.030548096s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:41 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[11.4( v 48'48 (0'0,48'48] local-lis/les=58/59 n=1 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=11.809011459s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=48'48 lcod 0'0 mlcod 0'0 active pruub 197.030593872s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:41 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[6.e( v 50'39 (0'0,50'39] local-lis/les=54/55 n=1 ec=54/21 lis/c=54/54 les/c/f=55/55/0 sis=62 pruub=15.576561928s) [1] r=-1 lpr=62 pi=[54,62)/1 crt=50'39 lcod 0'0 mlcod 0'0 active pruub 200.798171997s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:41 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[11.4( v 48'48 (0'0,48'48] local-lis/les=58/59 n=1 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=11.808938026s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=48'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 197.030593872s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:41 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[11.3( v 61'51 (0'0,61'51] local-lis/les=58/59 n=1 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=11.808858871s) [2] r=-1 lpr=62 pi=[58,62)/1 crt=59'49 lcod 59'50 mlcod 59'50 active pruub 197.030593872s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:41 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[6.e( v 50'39 (0'0,50'39] local-lis/les=54/55 n=1 ec=54/21 lis/c=54/54 les/c/f=55/55/0 sis=62 pruub=15.576468468s) [1] r=-1 lpr=62 pi=[54,62)/1 crt=50'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 200.798171997s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:41 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[11.3( v 61'51 (0'0,61'51] local-lis/les=58/59 n=1 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=11.808789253s) [2] r=-1 lpr=62 pi=[58,62)/1 crt=59'49 lcod 59'50 mlcod 0'0 unknown NOTIFY pruub 197.030593872s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:41 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[11.7( v 48'48 (0'0,48'48] local-lis/les=58/59 n=1 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=11.814203262s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=48'48 lcod 0'0 mlcod 0'0 active pruub 197.036163330s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:41 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[11.7( v 48'48 (0'0,48'48] local-lis/les=58/59 n=1 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=11.814068794s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=48'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 197.036163330s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:41 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[6.a( v 50'39 (0'0,50'39] local-lis/les=54/55 n=1 ec=54/21 lis/c=54/54 les/c/f=55/55/0 sis=62 pruub=15.576579094s) [1] r=-1 lpr=62 pi=[54,62)/1 crt=50'39 lcod 0'0 mlcod 0'0 active pruub 200.798721313s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:41 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[11.19( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=11.814126968s) [2] r=-1 lpr=62 pi=[58,62)/1 crt=48'48 lcod 0'0 mlcod 0'0 active pruub 197.036315918s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:41 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[6.a( v 50'39 (0'0,50'39] local-lis/les=54/55 n=1 ec=54/21 lis/c=54/54 les/c/f=55/55/0 sis=62 pruub=15.576539040s) [1] r=-1 lpr=62 pi=[54,62)/1 crt=50'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 200.798721313s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:41 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[11.19( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=11.814093590s) [2] r=-1 lpr=62 pi=[58,62)/1 crt=48'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 197.036315918s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:41 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[11.1d( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=11.813888550s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=48'48 lcod 0'0 mlcod 0'0 active pruub 197.036331177s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:41 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[11.1d( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=11.813417435s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=48'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 197.036331177s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:41 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[11.1( v 48'48 (0'0,48'48] local-lis/les=58/59 n=1 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=11.813626289s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=48'48 lcod 0'0 mlcod 0'0 active pruub 197.036590576s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:41 compute-1 ceph-mon[79770]: 7.7 scrub starts
Dec 06 09:42:41 compute-1 ceph-mon[79770]: 7.7 scrub ok
Dec 06 09:42:41 compute-1 ceph-mon[79770]: 8.d scrub starts
Dec 06 09:42:41 compute-1 ceph-mon[79770]: 8.d scrub ok
Dec 06 09:42:41 compute-1 ceph-mon[79770]: 6.0 scrub starts
Dec 06 09:42:41 compute-1 ceph-mon[79770]: 6.0 scrub ok
Dec 06 09:42:41 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "3"}]: dispatch
Dec 06 09:42:41 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec 06 09:42:41 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "2"}]: dispatch
Dec 06 09:42:41 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec 06 09:42:41 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[11.1( v 48'48 (0'0,48'48] local-lis/les=58/59 n=1 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=11.813585281s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=48'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 197.036590576s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:41 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[11.1e( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=11.813253403s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=48'48 lcod 0'0 mlcod 0'0 active pruub 197.036346436s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:41 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[11.1e( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=11.813191414s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=48'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 197.036346436s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:41 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[11.12( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=11.813434601s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=48'48 lcod 0'0 mlcod 0'0 active pruub 197.036651611s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:41 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[11.12( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=11.813399315s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=48'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 197.036651611s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:41 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[11.14( v 61'51 (0'0,61'51] local-lis/les=58/59 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=11.813307762s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=59'49 lcod 59'50 mlcod 59'50 active pruub 197.036697388s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:41 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[11.14( v 61'51 (0'0,61'51] local-lis/les=58/59 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=11.813238144s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=59'49 lcod 59'50 mlcod 0'0 unknown NOTIFY pruub 197.036697388s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:41 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[11.1b( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=11.813173294s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=48'48 lcod 0'0 mlcod 0'0 active pruub 197.036697388s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:41 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[11.1b( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=11.813140869s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=48'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 197.036697388s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:41 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[11.1a( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=11.812614441s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=48'48 lcod 0'0 mlcod 0'0 active pruub 197.036331177s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:41 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[11.1a( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=11.812572479s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=48'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 197.036331177s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:41 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[11.1c( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=11.812896729s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=48'48 lcod 0'0 mlcod 0'0 active pruub 197.036743164s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:41 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[11.1c( v 48'48 (0'0,48'48] local-lis/les=58/59 n=0 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=11.812850952s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=48'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 197.036743164s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:41 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[11.5( v 48'48 (0'0,48'48] local-lis/les=58/59 n=1 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=11.812460899s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=48'48 lcod 0'0 mlcod 0'0 active pruub 197.036560059s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:41 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 62 pg[11.5( v 48'48 (0'0,48'48] local-lis/les=58/59 n=1 ec=58/47 lis/c=58/58 les/c/f=59/59/0 sis=62 pruub=11.812422752s) [1] r=-1 lpr=62 pi=[58,62)/1 crt=48'48 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 197.036560059s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:41 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:41 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb00002f50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:42:41 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 8.e scrub starts
Dec 06 09:42:41 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 8.e scrub ok
Dec 06 09:42:41 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:41 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb00002f50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:42:42 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e63 e63: 3 total, 3 up, 3 in
Dec 06 09:42:42 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 63 pg[12.10( empty local-lis/les=62/63 n=0 ec=60/49 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:42 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 63 pg[12.c( empty local-lis/les=62/63 n=0 ec=60/49 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:42 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 63 pg[12.a( empty local-lis/les=62/63 n=0 ec=60/49 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:42 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 63 pg[12.b( empty local-lis/les=62/63 n=0 ec=60/49 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:42 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 63 pg[12.e( empty local-lis/les=62/63 n=0 ec=60/49 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:42 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 63 pg[12.8( empty local-lis/les=62/63 n=0 ec=60/49 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:42 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 63 pg[12.19( empty local-lis/les=62/63 n=0 ec=60/49 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:42 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 63 pg[12.6( empty local-lis/les=62/63 n=0 ec=60/49 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:42 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 63 pg[12.12( empty local-lis/les=62/63 n=0 ec=60/49 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:42 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 63 pg[12.1c( empty local-lis/les=62/63 n=0 ec=60/49 lis/c=60/60 les/c/f=61/61/0 sis=62) [0] r=0 lpr=62 pi=[60,62)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:42 compute-1 ceph-mon[79770]: 7.1 scrub starts
Dec 06 09:42:42 compute-1 ceph-mon[79770]: 7.1 scrub ok
Dec 06 09:42:42 compute-1 ceph-mon[79770]: pgmap v62: 337 pgs: 337 active+clean; 457 KiB data, 103 MiB used, 60 GiB / 60 GiB avail; 63 B/s, 1 keys/s, 3 objects/s recovering
Dec 06 09:42:42 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "3"}]': finished
Dec 06 09:42:42 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pgp_num_actual", "val": "32"}]': finished
Dec 06 09:42:42 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "2"}]': finished
Dec 06 09:42:42 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pgp_num_actual", "val": "32"}]': finished
Dec 06 09:42:42 compute-1 ceph-mon[79770]: osdmap e62: 3 total, 3 up, 3 in
Dec 06 09:42:42 compute-1 ceph-mon[79770]: 9.18 scrub starts
Dec 06 09:42:42 compute-1 ceph-mon[79770]: 9.18 scrub ok
Dec 06 09:42:42 compute-1 ceph-mon[79770]: 8.e scrub starts
Dec 06 09:42:42 compute-1 ceph-mon[79770]: 8.e scrub ok
Dec 06 09:42:42 compute-1 ceph-mon[79770]: osdmap e63: 3 total, 3 up, 3 in
Dec 06 09:42:42 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 9.c deep-scrub starts
Dec 06 09:42:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-1-uzbtlt[85318]: Sat Dec  6 09:42:42 2025: (VI_0) Master received advert from 192.168.122.100 with higher priority 100, ours 90
Dec 06 09:42:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-1-uzbtlt[85318]: Sat Dec  6 09:42:42 2025: (VI_0) Entering BACKUP STATE
Dec 06 09:42:42 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 9.c deep-scrub ok
Dec 06 09:42:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:42 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb200038d0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:42:42 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e63 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 09:42:43 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e64 e64: 3 total, 3 up, 3 in
Dec 06 09:42:43 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 64 pg[10.16( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=64) [0] r=0 lpr=64 pi=[58,64)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:43 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 64 pg[10.2( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=64) [0] r=0 lpr=64 pi=[58,64)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:43 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 64 pg[10.e( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=64) [0] r=0 lpr=64 pi=[58,64)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:43 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 64 pg[10.a( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=64) [0] r=0 lpr=64 pi=[58,64)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:43 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 64 pg[10.6( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=64) [0] r=0 lpr=64 pi=[58,64)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:43 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 64 pg[10.1e( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=64) [0] r=0 lpr=64 pi=[58,64)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:43 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 64 pg[10.12( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=64) [0] r=0 lpr=64 pi=[58,64)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:43 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 64 pg[10.1a( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=64) [0] r=0 lpr=64 pi=[58,64)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:43 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:43 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feafc003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:42:43 compute-1 ceph-mon[79770]: 7.d scrub starts
Dec 06 09:42:43 compute-1 ceph-mon[79770]: 7.d scrub ok
Dec 06 09:42:43 compute-1 ceph-mon[79770]: 8.16 scrub starts
Dec 06 09:42:43 compute-1 ceph-mon[79770]: 8.16 scrub ok
Dec 06 09:42:43 compute-1 ceph-mon[79770]: 9.c deep-scrub starts
Dec 06 09:42:43 compute-1 ceph-mon[79770]: 9.c deep-scrub ok
Dec 06 09:42:43 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "4"}]: dispatch
Dec 06 09:42:43 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "3"}]: dispatch
Dec 06 09:42:43 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "4"}]': finished
Dec 06 09:42:43 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "3"}]': finished
Dec 06 09:42:43 compute-1 ceph-mon[79770]: osdmap e64: 3 total, 3 up, 3 in
Dec 06 09:42:43 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 9.0 scrub starts
Dec 06 09:42:43 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 9.0 scrub ok
Dec 06 09:42:43 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:43 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb00002f50 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:42:44 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e65 e65: 3 total, 3 up, 3 in
Dec 06 09:42:44 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 65 pg[10.2( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=65) [0]/[1] r=-1 lpr=65 pi=[58,65)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:44 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 65 pg[10.e( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=65) [0]/[1] r=-1 lpr=65 pi=[58,65)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:44 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 65 pg[10.16( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=65) [0]/[1] r=-1 lpr=65 pi=[58,65)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:44 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 65 pg[10.2( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=65) [0]/[1] r=-1 lpr=65 pi=[58,65)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:44 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 65 pg[10.6( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=65) [0]/[1] r=-1 lpr=65 pi=[58,65)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:44 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 65 pg[10.e( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=65) [0]/[1] r=-1 lpr=65 pi=[58,65)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:44 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 65 pg[10.a( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=65) [0]/[1] r=-1 lpr=65 pi=[58,65)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:44 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 65 pg[10.16( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=65) [0]/[1] r=-1 lpr=65 pi=[58,65)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:44 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 65 pg[10.a( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=65) [0]/[1] r=-1 lpr=65 pi=[58,65)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:44 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 65 pg[10.6( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=65) [0]/[1] r=-1 lpr=65 pi=[58,65)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:44 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 65 pg[10.12( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=65) [0]/[1] r=-1 lpr=65 pi=[58,65)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:44 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 65 pg[10.12( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=65) [0]/[1] r=-1 lpr=65 pi=[58,65)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:44 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 65 pg[10.1a( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=65) [0]/[1] r=-1 lpr=65 pi=[58,65)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:44 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 65 pg[10.1a( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=65) [0]/[1] r=-1 lpr=65 pi=[58,65)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:44 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 65 pg[10.1e( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=65) [0]/[1] r=-1 lpr=65 pi=[58,65)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:44 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 65 pg[10.1e( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=65) [0]/[1] r=-1 lpr=65 pi=[58,65)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 09:42:44 compute-1 ceph-mon[79770]: 7.19 deep-scrub starts
Dec 06 09:42:44 compute-1 ceph-mon[79770]: 7.19 deep-scrub ok
Dec 06 09:42:44 compute-1 ceph-mon[79770]: pgmap v65: 337 pgs: 337 active+clean; 457 KiB data, 103 MiB used, 60 GiB / 60 GiB avail; 66 B/s, 1 keys/s, 3 objects/s recovering
Dec 06 09:42:44 compute-1 ceph-mon[79770]: 9.0 scrub starts
Dec 06 09:42:44 compute-1 ceph-mon[79770]: 9.0 scrub ok
Dec 06 09:42:44 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:42:44 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:42:44 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:42:44 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:42:44 compute-1 ceph-mon[79770]: osdmap e65: 3 total, 3 up, 3 in
Dec 06 09:42:44 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:42:44 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 9.1 deep-scrub starts
Dec 06 09:42:44 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 9.1 deep-scrub ok
Dec 06 09:42:44 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:44 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb18002520 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:42:45 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e66 e66: 3 total, 3 up, 3 in
Dec 06 09:42:45 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:45 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb200038d0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:42:45 compute-1 ceph-mon[79770]: Deploying daemon alertmanager.compute-0 on compute-0
Dec 06 09:42:45 compute-1 ceph-mon[79770]: 9.1 deep-scrub starts
Dec 06 09:42:45 compute-1 ceph-mon[79770]: 9.1 deep-scrub ok
Dec 06 09:42:45 compute-1 ceph-mon[79770]: osdmap e66: 3 total, 3 up, 3 in
Dec 06 09:42:45 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e67 e67: 3 total, 3 up, 3 in
Dec 06 09:42:45 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 67 pg[10.1a( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=5 ec=58/45 lis/c=65/58 les/c/f=66/59/0 sis=67) [0] r=0 lpr=67 pi=[58,67)/1 luod=0'0 crt=51'1027 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:45 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 67 pg[10.1a( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=5 ec=58/45 lis/c=65/58 les/c/f=66/59/0 sis=67) [0] r=0 lpr=67 pi=[58,67)/1 crt=51'1027 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:45 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:45 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feafc003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:42:46 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e68 e68: 3 total, 3 up, 3 in
Dec 06 09:42:46 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 68 pg[10.16( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=5 ec=58/45 lis/c=65/58 les/c/f=66/59/0 sis=68) [0] r=0 lpr=68 pi=[58,68)/1 luod=0'0 crt=51'1027 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:46 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 68 pg[10.16( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=5 ec=58/45 lis/c=65/58 les/c/f=66/59/0 sis=68) [0] r=0 lpr=68 pi=[58,68)/1 crt=51'1027 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:46 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 68 pg[10.12( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=6 ec=58/45 lis/c=65/58 les/c/f=66/59/0 sis=68) [0] r=0 lpr=68 pi=[58,68)/1 luod=0'0 crt=51'1027 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:46 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 68 pg[10.12( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=6 ec=58/45 lis/c=65/58 les/c/f=66/59/0 sis=68) [0] r=0 lpr=68 pi=[58,68)/1 crt=51'1027 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:46 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 68 pg[10.a( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=6 ec=58/45 lis/c=65/58 les/c/f=66/59/0 sis=68) [0] r=0 lpr=68 pi=[58,68)/1 luod=0'0 crt=51'1027 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:46 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 68 pg[10.a( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=6 ec=58/45 lis/c=65/58 les/c/f=66/59/0 sis=68) [0] r=0 lpr=68 pi=[58,68)/1 crt=51'1027 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:46 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 68 pg[10.e( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=6 ec=58/45 lis/c=65/58 les/c/f=66/59/0 sis=68) [0] r=0 lpr=68 pi=[58,68)/1 luod=0'0 crt=51'1027 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:46 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 68 pg[10.e( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=6 ec=58/45 lis/c=65/58 les/c/f=66/59/0 sis=68) [0] r=0 lpr=68 pi=[58,68)/1 crt=51'1027 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:46 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 68 pg[10.2( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=6 ec=58/45 lis/c=65/58 les/c/f=66/59/0 sis=68) [0] r=0 lpr=68 pi=[58,68)/1 luod=0'0 crt=51'1027 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:46 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 68 pg[10.2( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=6 ec=58/45 lis/c=65/58 les/c/f=66/59/0 sis=68) [0] r=0 lpr=68 pi=[58,68)/1 crt=51'1027 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:46 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 68 pg[10.6( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=6 ec=58/45 lis/c=65/58 les/c/f=66/59/0 sis=68) [0] r=0 lpr=68 pi=[58,68)/1 luod=0'0 crt=51'1027 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:46 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 68 pg[10.6( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=6 ec=58/45 lis/c=65/58 les/c/f=66/59/0 sis=68) [0] r=0 lpr=68 pi=[58,68)/1 crt=51'1027 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:46 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 68 pg[10.1e( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=5 ec=58/45 lis/c=65/58 les/c/f=66/59/0 sis=68) [0] r=0 lpr=68 pi=[58,68)/1 luod=0'0 crt=51'1027 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:42:46 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 68 pg[10.1e( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=5 ec=58/45 lis/c=65/58 les/c/f=66/59/0 sis=68) [0] r=0 lpr=68 pi=[58,68)/1 crt=51'1027 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:42:46 compute-1 ceph-mon[79770]: pgmap v68: 337 pgs: 8 remapped+peering, 16 peering, 1 active+recovering, 312 active+clean; 458 KiB data, 125 MiB used, 60 GiB / 60 GiB avail; 3.7 KiB/s rd, 3 op/s; 1/226 objects misplaced (0.442%); 722 B/s, 2 keys/s, 23 objects/s recovering
Dec 06 09:42:46 compute-1 ceph-mon[79770]: osdmap e67: 3 total, 3 up, 3 in
Dec 06 09:42:46 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 68 pg[10.1a( v 51'1027 (0'0,51'1027] local-lis/les=67/68 n=5 ec=58/45 lis/c=65/58 les/c/f=66/59/0 sis=67) [0] r=0 lpr=67 pi=[58,67)/1 crt=51'1027 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:46 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 10.1a scrub starts
Dec 06 09:42:46 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 10.1a scrub ok
Dec 06 09:42:46 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:46 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb00004050 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:42:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:47 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb18002520 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:42:47 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e69 e69: 3 total, 3 up, 3 in
Dec 06 09:42:47 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 69 pg[10.16( v 51'1027 (0'0,51'1027] local-lis/les=68/69 n=5 ec=58/45 lis/c=65/58 les/c/f=66/59/0 sis=68) [0] r=0 lpr=68 pi=[58,68)/1 crt=51'1027 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:47 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 69 pg[10.a( v 51'1027 (0'0,51'1027] local-lis/les=68/69 n=6 ec=58/45 lis/c=65/58 les/c/f=66/59/0 sis=68) [0] r=0 lpr=68 pi=[58,68)/1 crt=51'1027 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:47 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 69 pg[10.e( v 51'1027 (0'0,51'1027] local-lis/les=68/69 n=6 ec=58/45 lis/c=65/58 les/c/f=66/59/0 sis=68) [0] r=0 lpr=68 pi=[58,68)/1 crt=51'1027 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:47 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 69 pg[10.2( v 51'1027 (0'0,51'1027] local-lis/les=68/69 n=6 ec=58/45 lis/c=65/58 les/c/f=66/59/0 sis=68) [0] r=0 lpr=68 pi=[58,68)/1 crt=51'1027 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:47 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 69 pg[10.6( v 51'1027 (0'0,51'1027] local-lis/les=68/69 n=6 ec=58/45 lis/c=65/58 les/c/f=66/59/0 sis=68) [0] r=0 lpr=68 pi=[58,68)/1 crt=51'1027 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:47 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 69 pg[10.1e( v 51'1027 (0'0,51'1027] local-lis/les=68/69 n=5 ec=58/45 lis/c=65/58 les/c/f=66/59/0 sis=68) [0] r=0 lpr=68 pi=[58,68)/1 crt=51'1027 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:47 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 69 pg[10.12( v 51'1027 (0'0,51'1027] local-lis/les=68/69 n=6 ec=58/45 lis/c=65/58 les/c/f=66/59/0 sis=68) [0] r=0 lpr=68 pi=[58,68)/1 crt=51'1027 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:42:47 compute-1 ceph-mon[79770]: 10.1f scrub starts
Dec 06 09:42:47 compute-1 ceph-mon[79770]: 10.1f scrub ok
Dec 06 09:42:47 compute-1 ceph-mon[79770]: osdmap e68: 3 total, 3 up, 3 in
Dec 06 09:42:47 compute-1 ceph-mon[79770]: 10.1a scrub starts
Dec 06 09:42:47 compute-1 ceph-mon[79770]: 10.1a scrub ok
Dec 06 09:42:47 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 10.16 scrub starts
Dec 06 09:42:47 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 10.16 scrub ok
Dec 06 09:42:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:47 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb200038d0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:42:47 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e69 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 09:42:48 compute-1 ceph-mon[79770]: pgmap v72: 337 pgs: 8 remapped+peering, 16 peering, 1 active+recovering, 312 active+clean; 458 KiB data, 125 MiB used, 60 GiB / 60 GiB avail; 3.9 KiB/s rd, 3 op/s; 1/226 objects misplaced (0.442%); 757 B/s, 2 keys/s, 24 objects/s recovering
Dec 06 09:42:48 compute-1 ceph-mon[79770]: 10.11 scrub starts
Dec 06 09:42:48 compute-1 ceph-mon[79770]: 10.11 scrub ok
Dec 06 09:42:48 compute-1 ceph-mon[79770]: osdmap e69: 3 total, 3 up, 3 in
Dec 06 09:42:48 compute-1 ceph-mon[79770]: 10.16 scrub starts
Dec 06 09:42:48 compute-1 ceph-mon[79770]: 10.16 scrub ok
Dec 06 09:42:48 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 10.e scrub starts
Dec 06 09:42:48 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 10.e scrub ok
Dec 06 09:42:48 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:48 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feafc003c10 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:42:49 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:49 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb00004050 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:42:49 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 10.a deep-scrub starts
Dec 06 09:42:49 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 10.a deep-scrub ok
Dec 06 09:42:49 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:49 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feaf4000b60 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:42:50 compute-1 ceph-mon[79770]: 10.13 scrub starts
Dec 06 09:42:50 compute-1 ceph-mon[79770]: 10.13 scrub ok
Dec 06 09:42:50 compute-1 ceph-mon[79770]: 10.e scrub starts
Dec 06 09:42:50 compute-1 ceph-mon[79770]: 10.e scrub ok
Dec 06 09:42:50 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:42:50 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:42:50 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:42:50 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:42:50 compute-1 ceph-mon[79770]: Regenerating cephadm self-signed grafana TLS certificates
Dec 06 09:42:50 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:42:50 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:42:50 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "dashboard set-grafana-api-ssl-verify", "value": "false"}]: dispatch
Dec 06 09:42:50 compute-1 ceph-mon[79770]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard set-grafana-api-ssl-verify", "value": "false"}]: dispatch
Dec 06 09:42:50 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:42:50 compute-1 ceph-mon[79770]: Deploying daemon grafana.compute-0 on compute-0
Dec 06 09:42:50 compute-1 ceph-mon[79770]: 7.1a scrub starts
Dec 06 09:42:50 compute-1 ceph-mon[79770]: 7.1a scrub ok
Dec 06 09:42:50 compute-1 ceph-mon[79770]: pgmap v74: 337 pgs: 8 remapped+peering, 16 peering, 1 active+recovering, 312 active+clean; 458 KiB data, 125 MiB used, 60 GiB / 60 GiB avail; 1/226 objects misplaced (0.442%)
Dec 06 09:42:50 compute-1 ceph-mon[79770]: 8.15 scrub starts
Dec 06 09:42:50 compute-1 ceph-mon[79770]: 8.15 scrub ok
Dec 06 09:42:50 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 10.2 scrub starts
Dec 06 09:42:50 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 10.2 scrub ok
Dec 06 09:42:50 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:50 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb24002010 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:42:51 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:51 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb200045e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:42:51 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e70 e70: 3 total, 3 up, 3 in
Dec 06 09:42:51 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 8.1 scrub starts
Dec 06 09:42:51 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 8.1 scrub ok
Dec 06 09:42:51 compute-1 ceph-mon[79770]: 10.a deep-scrub starts
Dec 06 09:42:51 compute-1 ceph-mon[79770]: 10.a deep-scrub ok
Dec 06 09:42:51 compute-1 ceph-mon[79770]: 8.8 scrub starts
Dec 06 09:42:51 compute-1 ceph-mon[79770]: 8.8 scrub ok
Dec 06 09:42:51 compute-1 ceph-mon[79770]: 8.f scrub starts
Dec 06 09:42:51 compute-1 ceph-mon[79770]: 8.f scrub ok
Dec 06 09:42:51 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:42:51 compute-1 ceph-mon[79770]: 10.2 scrub starts
Dec 06 09:42:51 compute-1 ceph-mon[79770]: 10.2 scrub ok
Dec 06 09:42:51 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "5"}]: dispatch
Dec 06 09:42:51 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "4"}]: dispatch
Dec 06 09:42:51 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:51 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb00004050 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:42:52 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 8.0 scrub starts
Dec 06 09:42:52 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 8.0 scrub ok
Dec 06 09:42:52 compute-1 ceph-mon[79770]: 9.11 scrub starts
Dec 06 09:42:52 compute-1 ceph-mon[79770]: 9.11 scrub ok
Dec 06 09:42:52 compute-1 ceph-mon[79770]: pgmap v75: 337 pgs: 337 active+clean; 458 KiB data, 125 MiB used, 60 GiB / 60 GiB avail; 0 B/s, 1 objects/s recovering
Dec 06 09:42:52 compute-1 ceph-mon[79770]: 9.1d scrub starts
Dec 06 09:42:52 compute-1 ceph-mon[79770]: 9.1d scrub ok
Dec 06 09:42:52 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "5"}]': finished
Dec 06 09:42:52 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "4"}]': finished
Dec 06 09:42:52 compute-1 ceph-mon[79770]: osdmap e70: 3 total, 3 up, 3 in
Dec 06 09:42:52 compute-1 ceph-mon[79770]: 8.1 scrub starts
Dec 06 09:42:52 compute-1 ceph-mon[79770]: 8.1 scrub ok
Dec 06 09:42:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:52 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb00004050 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:42:52 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e70 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 09:42:53 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:53 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb24002010 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:42:53 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 8.7 scrub starts
Dec 06 09:42:53 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 8.7 scrub ok
Dec 06 09:42:53 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e71 e71: 3 total, 3 up, 3 in
Dec 06 09:42:53 compute-1 ceph-mon[79770]: 8.14 scrub starts
Dec 06 09:42:53 compute-1 ceph-mon[79770]: 8.14 scrub ok
Dec 06 09:42:53 compute-1 ceph-mon[79770]: 9.13 deep-scrub starts
Dec 06 09:42:53 compute-1 ceph-mon[79770]: 9.13 deep-scrub ok
Dec 06 09:42:53 compute-1 ceph-mon[79770]: 8.0 scrub starts
Dec 06 09:42:53 compute-1 ceph-mon[79770]: 8.0 scrub ok
Dec 06 09:42:53 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "6"}]: dispatch
Dec 06 09:42:53 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "5"}]: dispatch
Dec 06 09:42:53 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:53 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb200045e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:42:54 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 9.1a scrub starts
Dec 06 09:42:54 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 9.1a scrub ok
Dec 06 09:42:54 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:54 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feaf40016a0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:42:55 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e72 e72: 3 total, 3 up, 3 in
Dec 06 09:42:55 compute-1 ceph-mon[79770]: 9.12 scrub starts
Dec 06 09:42:55 compute-1 ceph-mon[79770]: 9.12 scrub ok
Dec 06 09:42:55 compute-1 ceph-mon[79770]: pgmap v77: 337 pgs: 337 active+clean; 458 KiB data, 125 MiB used, 60 GiB / 60 GiB avail; 0 B/s, 0 objects/s recovering
Dec 06 09:42:55 compute-1 ceph-mon[79770]: 8.c scrub starts
Dec 06 09:42:55 compute-1 ceph-mon[79770]: 8.c scrub ok
Dec 06 09:42:55 compute-1 ceph-mon[79770]: 8.7 scrub starts
Dec 06 09:42:55 compute-1 ceph-mon[79770]: 8.7 scrub ok
Dec 06 09:42:55 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "6"}]': finished
Dec 06 09:42:55 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "5"}]': finished
Dec 06 09:42:55 compute-1 ceph-mon[79770]: osdmap e71: 3 total, 3 up, 3 in
Dec 06 09:42:55 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:55 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb00004050 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:42:55 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 8.1a scrub starts
Dec 06 09:42:55 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 8.1a scrub ok
Dec 06 09:42:55 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:55 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb24002010 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:42:56 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e73 e73: 3 total, 3 up, 3 in
Dec 06 09:42:56 compute-1 ceph-mon[79770]: 9.d scrub starts
Dec 06 09:42:56 compute-1 ceph-mon[79770]: 9.d scrub ok
Dec 06 09:42:56 compute-1 ceph-mon[79770]: 8.1c scrub starts
Dec 06 09:42:56 compute-1 ceph-mon[79770]: 8.1c scrub ok
Dec 06 09:42:56 compute-1 ceph-mon[79770]: 9.1a scrub starts
Dec 06 09:42:56 compute-1 ceph-mon[79770]: 9.1a scrub ok
Dec 06 09:42:56 compute-1 ceph-mon[79770]: 9.15 deep-scrub starts
Dec 06 09:42:56 compute-1 ceph-mon[79770]: pgmap v79: 337 pgs: 4 unknown, 2 peering, 331 active+clean; 457 KiB data, 125 MiB used, 60 GiB / 60 GiB avail; 0 B/s, 0 objects/s recovering
Dec 06 09:42:56 compute-1 ceph-mon[79770]: 9.15 deep-scrub ok
Dec 06 09:42:56 compute-1 ceph-mon[79770]: 9.9 deep-scrub starts
Dec 06 09:42:56 compute-1 ceph-mon[79770]: 9.9 deep-scrub ok
Dec 06 09:42:56 compute-1 ceph-mon[79770]: osdmap e72: 3 total, 3 up, 3 in
Dec 06 09:42:56 compute-1 ceph-mon[79770]: 8.1a scrub starts
Dec 06 09:42:56 compute-1 ceph-mon[79770]: 8.1a scrub ok
Dec 06 09:42:56 compute-1 ceph-mon[79770]: osdmap e73: 3 total, 3 up, 3 in
Dec 06 09:42:56 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 9.1b scrub starts
Dec 06 09:42:56 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 9.1b scrub ok
Dec 06 09:42:56 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:56 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb200045e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:42:57 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e74 e74: 3 total, 3 up, 3 in
Dec 06 09:42:57 compute-1 ceph-mon[79770]: 8.3 scrub starts
Dec 06 09:42:57 compute-1 ceph-mon[79770]: 9.e scrub starts
Dec 06 09:42:57 compute-1 ceph-mon[79770]: 8.3 scrub ok
Dec 06 09:42:57 compute-1 ceph-mon[79770]: 9.e scrub ok
Dec 06 09:42:57 compute-1 ceph-mon[79770]: 9.1b scrub starts
Dec 06 09:42:57 compute-1 ceph-mon[79770]: 9.1b scrub ok
Dec 06 09:42:57 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:42:57 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:42:57 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:42:57 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:42:57 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:42:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:57 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feaf4001fc0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:42:57 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 8.1e deep-scrub starts
Dec 06 09:42:57 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 8.1e deep-scrub ok
Dec 06 09:42:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:57 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb00004050 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:42:57 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e74 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 09:42:58 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 9.1f scrub starts
Dec 06 09:42:58 compute-1 ceph-mon[79770]: Deploying daemon haproxy.rgw.default.compute-0.vhqyer on compute-0
Dec 06 09:42:58 compute-1 ceph-mon[79770]: pgmap v82: 337 pgs: 4 unknown, 2 peering, 331 active+clean; 457 KiB data, 125 MiB used, 60 GiB / 60 GiB avail
Dec 06 09:42:58 compute-1 ceph-mon[79770]: 8.19 scrub starts
Dec 06 09:42:58 compute-1 ceph-mon[79770]: 8.19 scrub ok
Dec 06 09:42:58 compute-1 ceph-mon[79770]: 7.1d scrub starts
Dec 06 09:42:58 compute-1 ceph-mon[79770]: 7.1d scrub ok
Dec 06 09:42:58 compute-1 ceph-mon[79770]: osdmap e74: 3 total, 3 up, 3 in
Dec 06 09:42:58 compute-1 ceph-mon[79770]: 8.1e deep-scrub starts
Dec 06 09:42:58 compute-1 ceph-mon[79770]: 8.1e deep-scrub ok
Dec 06 09:42:58 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 9.1f scrub ok
Dec 06 09:42:58 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e75 e75: 3 total, 3 up, 3 in
Dec 06 09:42:58 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:58 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb24002010 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:42:59 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 8.1d scrub starts
Dec 06 09:42:59 compute-1 ceph-mon[79770]: 9.a scrub starts
Dec 06 09:42:59 compute-1 ceph-mon[79770]: 9.a scrub ok
Dec 06 09:42:59 compute-1 ceph-mon[79770]: 7.1f scrub starts
Dec 06 09:42:59 compute-1 ceph-mon[79770]: 7.1f scrub ok
Dec 06 09:42:59 compute-1 ceph-mon[79770]: 9.1f scrub starts
Dec 06 09:42:59 compute-1 ceph-mon[79770]: 9.1f scrub ok
Dec 06 09:42:59 compute-1 ceph-mon[79770]: osdmap e75: 3 total, 3 up, 3 in
Dec 06 09:42:59 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:59 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb200045e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:42:59 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 8.1d scrub ok
Dec 06 09:42:59 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:42:59 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feaf4001fc0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:42:59 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:42:59 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.004000096s ======
Dec 06 09:42:59 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:42:59.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.004000096s
Dec 06 09:43:00 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 9.1c scrub starts
Dec 06 09:43:00 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 9.1c scrub ok
Dec 06 09:43:00 compute-1 ceph-mon[79770]: pgmap v85: 337 pgs: 4 unknown, 2 peering, 331 active+clean; 457 KiB data, 125 MiB used, 60 GiB / 60 GiB avail
Dec 06 09:43:00 compute-1 ceph-mon[79770]: 8.12 scrub starts
Dec 06 09:43:00 compute-1 ceph-mon[79770]: 8.12 scrub ok
Dec 06 09:43:00 compute-1 ceph-mon[79770]: 8.9 scrub starts
Dec 06 09:43:00 compute-1 ceph-mon[79770]: 8.9 scrub ok
Dec 06 09:43:00 compute-1 ceph-mon[79770]: 8.1d scrub starts
Dec 06 09:43:00 compute-1 ceph-mon[79770]: 8.1d scrub ok
Dec 06 09:43:00 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:43:00 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:43:00 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:43:00 compute-1 ceph-mon[79770]: Deploying daemon haproxy.rgw.default.compute-2.mwbfro on compute-2
Dec 06 09:43:00 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:00 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb00004050 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:43:01 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:01 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb240091b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:43:01 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 8.13 scrub starts
Dec 06 09:43:01 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 8.13 scrub ok
Dec 06 09:43:01 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:01 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb200045e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:43:01 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:43:01 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 09:43:01 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:01.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 09:43:02 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 9.4 scrub starts
Dec 06 09:43:02 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 9.4 scrub ok
Dec 06 09:43:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:02 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feaf4001fc0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:43:02 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e75 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 09:43:02 compute-1 ceph-mon[79770]: 9.b scrub starts
Dec 06 09:43:02 compute-1 ceph-mon[79770]: 8.4 deep-scrub starts
Dec 06 09:43:02 compute-1 ceph-mon[79770]: 9.b scrub ok
Dec 06 09:43:02 compute-1 ceph-mon[79770]: 8.4 deep-scrub ok
Dec 06 09:43:02 compute-1 ceph-mon[79770]: 9.1c scrub starts
Dec 06 09:43:02 compute-1 ceph-mon[79770]: 9.1c scrub ok
Dec 06 09:43:02 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:43:02 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "7"}]: dispatch
Dec 06 09:43:02 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "6"}]: dispatch
Dec 06 09:43:02 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e76 e76: 3 total, 3 up, 3 in
Dec 06 09:43:03 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 76 pg[10.d( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=76) [0] r=0 lpr=76 pi=[65,76)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:43:03 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 76 pg[10.1d( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=76) [0] r=0 lpr=76 pi=[65,76)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:43:03 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 76 pg[10.5( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=76) [0] r=0 lpr=76 pi=[65,76)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:43:03 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 76 pg[10.15( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=76) [0] r=0 lpr=76 pi=[65,76)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:43:03 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:03 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb00004050 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:43:03 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 9.19 scrub starts
Dec 06 09:43:03 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 9.19 scrub ok
Dec 06 09:43:03 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:43:03 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 09:43:03 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:03.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 09:43:03 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:03 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb240091b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:43:03 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:43:03 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 09:43:03 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:03.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 09:43:04 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 9.1e scrub starts
Dec 06 09:43:04 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 9.1e scrub ok
Dec 06 09:43:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:04 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb200045e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:43:05 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 7.1b scrub starts
Dec 06 09:43:05 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:05 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb200045e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:43:05 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 7.1b scrub ok
Dec 06 09:43:05 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:43:05 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 09:43:05 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:05.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 09:43:05 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:05 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb00004050 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:43:05 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 76 pg[6.e( empty local-lis/les=0/0 n=0 ec=54/21 lis/c=62/62 les/c/f=63/63/0 sis=76) [0] r=0 lpr=76 pi=[62,76)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:43:05 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 76 pg[6.6( empty local-lis/les=0/0 n=0 ec=54/21 lis/c=62/62 les/c/f=63/63/0 sis=76) [0] r=0 lpr=76 pi=[62,76)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:43:05 compute-1 ceph-mon[79770]: pgmap v86: 337 pgs: 337 active+clean; 458 KiB data, 125 MiB used, 60 GiB / 60 GiB avail; 24 KiB/s rd, 0 B/s wr, 45 op/s; 106 B/s, 5 objects/s recovering
Dec 06 09:43:05 compute-1 ceph-mon[79770]: 8.11 scrub starts
Dec 06 09:43:05 compute-1 ceph-mon[79770]: 8.11 scrub ok
Dec 06 09:43:05 compute-1 ceph-mon[79770]: 8.17 scrub starts
Dec 06 09:43:05 compute-1 ceph-mon[79770]: 8.17 scrub ok
Dec 06 09:43:05 compute-1 ceph-mon[79770]: 8.13 scrub starts
Dec 06 09:43:05 compute-1 ceph-mon[79770]: 8.13 scrub ok
Dec 06 09:43:05 compute-1 ceph-mon[79770]: 9.10 scrub starts
Dec 06 09:43:05 compute-1 ceph-mon[79770]: 8.a scrub starts
Dec 06 09:43:05 compute-1 ceph-mon[79770]: 9.4 scrub starts
Dec 06 09:43:05 compute-1 ceph-mon[79770]: 9.4 scrub ok
Dec 06 09:43:05 compute-1 ceph-mon[79770]: 9.10 scrub ok
Dec 06 09:43:05 compute-1 ceph-mon[79770]: 8.a scrub ok
Dec 06 09:43:05 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "7"}]': finished
Dec 06 09:43:05 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "6"}]': finished
Dec 06 09:43:05 compute-1 ceph-mon[79770]: osdmap e76: 3 total, 3 up, 3 in
Dec 06 09:43:05 compute-1 ceph-mon[79770]: pgmap v88: 337 pgs: 337 active+clean; 458 KiB data, 125 MiB used, 60 GiB / 60 GiB avail; 23 KiB/s rd, 0 B/s wr, 44 op/s; 104 B/s, 5 objects/s recovering
Dec 06 09:43:05 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "8"}]: dispatch
Dec 06 09:43:05 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "7"}]: dispatch
Dec 06 09:43:05 compute-1 ceph-mon[79770]: 7.5 deep-scrub starts
Dec 06 09:43:05 compute-1 ceph-mon[79770]: 7.5 deep-scrub ok
Dec 06 09:43:05 compute-1 ceph-mon[79770]: 8.10 scrub starts
Dec 06 09:43:05 compute-1 ceph-mon[79770]: 8.10 scrub ok
Dec 06 09:43:05 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:43:05 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:43:05 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:43:05 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:43:05 compute-1 ceph-mon[79770]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Dec 06 09:43:05 compute-1 ceph-mon[79770]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Dec 06 09:43:05 compute-1 ceph-mon[79770]: Deploying daemon keepalived.rgw.default.compute-0.mycoxk on compute-0
Dec 06 09:43:05 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:43:05 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:43:05 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:05.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:43:05 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e77 e77: 3 total, 3 up, 3 in
Dec 06 09:43:05 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 77 pg[10.16( v 51'1027 (0'0,51'1027] local-lis/les=68/69 n=4 ec=58/45 lis/c=68/68 les/c/f=69/69/0 sis=77 pruub=13.425968170s) [1] r=-1 lpr=77 pi=[68,77)/1 crt=51'1027 mlcod 0'0 active pruub 223.407394409s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:43:05 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 77 pg[10.16( v 51'1027 (0'0,51'1027] local-lis/les=68/69 n=4 ec=58/45 lis/c=68/68 les/c/f=69/69/0 sis=77 pruub=13.425865173s) [1] r=-1 lpr=77 pi=[68,77)/1 crt=51'1027 mlcod 0'0 unknown NOTIFY pruub 223.407394409s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:43:05 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 77 pg[10.d( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=77) [0]/[2] r=-1 lpr=77 pi=[65,77)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:43:05 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 77 pg[10.d( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=77) [0]/[2] r=-1 lpr=77 pi=[65,77)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 09:43:05 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 77 pg[10.e( v 51'1027 (0'0,51'1027] local-lis/les=68/69 n=5 ec=58/45 lis/c=68/68 les/c/f=69/69/0 sis=77 pruub=13.427437782s) [1] r=-1 lpr=77 pi=[68,77)/1 crt=51'1027 mlcod 0'0 active pruub 223.410537720s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:43:05 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 77 pg[10.e( v 51'1027 (0'0,51'1027] local-lis/les=68/69 n=5 ec=58/45 lis/c=68/68 les/c/f=69/69/0 sis=77 pruub=13.427408218s) [1] r=-1 lpr=77 pi=[68,77)/1 crt=51'1027 mlcod 0'0 unknown NOTIFY pruub 223.410537720s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:43:05 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 77 pg[10.5( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=77) [0]/[2] r=-1 lpr=77 pi=[65,77)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:43:05 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 77 pg[10.1d( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=77) [0]/[2] r=-1 lpr=77 pi=[65,77)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:43:05 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 77 pg[10.6( v 51'1027 (0'0,51'1027] local-lis/les=68/69 n=6 ec=58/45 lis/c=68/68 les/c/f=69/69/0 sis=77 pruub=13.426686287s) [1] r=-1 lpr=77 pi=[68,77)/1 crt=51'1027 mlcod 0'0 active pruub 223.410598755s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:43:05 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 77 pg[10.1d( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=77) [0]/[2] r=-1 lpr=77 pi=[65,77)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 09:43:05 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 77 pg[10.6( v 51'1027 (0'0,51'1027] local-lis/les=68/69 n=6 ec=58/45 lis/c=68/68 les/c/f=69/69/0 sis=77 pruub=13.426582336s) [1] r=-1 lpr=77 pi=[68,77)/1 crt=51'1027 mlcod 0'0 unknown NOTIFY pruub 223.410598755s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:43:05 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 77 pg[10.1e( v 51'1027 (0'0,51'1027] local-lis/les=68/69 n=5 ec=58/45 lis/c=68/68 les/c/f=69/69/0 sis=77 pruub=13.426499367s) [1] r=-1 lpr=77 pi=[68,77)/1 crt=51'1027 mlcod 0'0 active pruub 223.410690308s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:43:05 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 77 pg[10.1e( v 51'1027 (0'0,51'1027] local-lis/les=68/69 n=5 ec=58/45 lis/c=68/68 les/c/f=69/69/0 sis=77 pruub=13.426470757s) [1] r=-1 lpr=77 pi=[68,77)/1 crt=51'1027 mlcod 0'0 unknown NOTIFY pruub 223.410690308s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:43:05 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 77 pg[10.15( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=77) [0]/[2] r=-1 lpr=77 pi=[65,77)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:43:05 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 77 pg[10.15( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=77) [0]/[2] r=-1 lpr=77 pi=[65,77)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 09:43:05 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 77 pg[10.5( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=77) [0]/[2] r=-1 lpr=77 pi=[65,77)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 09:43:05 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 77 pg[6.e( v 50'39 lc 48'19 (0'0,50'39] local-lis/les=76/77 n=1 ec=54/21 lis/c=62/62 les/c/f=63/63/0 sis=76) [0] r=0 lpr=76 pi=[62,76)/1 crt=50'39 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:43:05 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 77 pg[6.6( v 50'39 lc 0'0 (0'0,50'39] local-lis/les=76/77 n=1 ec=54/21 lis/c=62/62 les/c/f=63/63/0 sis=76) [0] r=0 lpr=76 pi=[62,76)/1 crt=50'39 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:43:06 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 7.6 scrub starts
Dec 06 09:43:06 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 7.6 scrub ok
Dec 06 09:43:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:06 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb240091b0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:43:06 compute-1 ceph-mon[79770]: 9.19 scrub starts
Dec 06 09:43:06 compute-1 ceph-mon[79770]: 9.19 scrub ok
Dec 06 09:43:06 compute-1 ceph-mon[79770]: 8.1f scrub starts
Dec 06 09:43:06 compute-1 ceph-mon[79770]: 8.1f scrub ok
Dec 06 09:43:06 compute-1 ceph-mon[79770]: 9.f scrub starts
Dec 06 09:43:06 compute-1 ceph-mon[79770]: 9.1e scrub starts
Dec 06 09:43:06 compute-1 ceph-mon[79770]: 9.1e scrub ok
Dec 06 09:43:06 compute-1 ceph-mon[79770]: 8.5 scrub starts
Dec 06 09:43:06 compute-1 ceph-mon[79770]: 8.5 scrub ok
Dec 06 09:43:06 compute-1 ceph-mon[79770]: pgmap v89: 337 pgs: 4 unknown, 333 active+clean; 457 KiB data, 125 MiB used, 60 GiB / 60 GiB avail; 4 B/s, 0 objects/s recovering
Dec 06 09:43:06 compute-1 ceph-mon[79770]: 7.1b scrub starts
Dec 06 09:43:06 compute-1 ceph-mon[79770]: 7.1b scrub ok
Dec 06 09:43:06 compute-1 ceph-mon[79770]: 8.18 scrub starts
Dec 06 09:43:06 compute-1 ceph-mon[79770]: 9.f scrub ok
Dec 06 09:43:06 compute-1 ceph-mon[79770]: 8.18 scrub ok
Dec 06 09:43:06 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "8"}]': finished
Dec 06 09:43:06 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "7"}]': finished
Dec 06 09:43:06 compute-1 ceph-mon[79770]: osdmap e77: 3 total, 3 up, 3 in
Dec 06 09:43:06 compute-1 ceph-mon[79770]: 8.b scrub starts
Dec 06 09:43:06 compute-1 ceph-mon[79770]: 8.b scrub ok
Dec 06 09:43:06 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e78 e78: 3 total, 3 up, 3 in
Dec 06 09:43:06 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 78 pg[10.1e( v 51'1027 (0'0,51'1027] local-lis/les=68/69 n=5 ec=58/45 lis/c=68/68 les/c/f=69/69/0 sis=78) [1]/[0] r=0 lpr=78 pi=[68,78)/1 crt=51'1027 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:43:06 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 78 pg[10.6( v 51'1027 (0'0,51'1027] local-lis/les=68/69 n=6 ec=58/45 lis/c=68/68 les/c/f=69/69/0 sis=78) [1]/[0] r=0 lpr=78 pi=[68,78)/1 crt=51'1027 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:43:06 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 78 pg[10.1e( v 51'1027 (0'0,51'1027] local-lis/les=68/69 n=5 ec=58/45 lis/c=68/68 les/c/f=69/69/0 sis=78) [1]/[0] r=0 lpr=78 pi=[68,78)/1 crt=51'1027 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Dec 06 09:43:06 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 78 pg[10.6( v 51'1027 (0'0,51'1027] local-lis/les=68/69 n=6 ec=58/45 lis/c=68/68 les/c/f=69/69/0 sis=78) [1]/[0] r=0 lpr=78 pi=[68,78)/1 crt=51'1027 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Dec 06 09:43:06 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 78 pg[10.e( v 51'1027 (0'0,51'1027] local-lis/les=68/69 n=5 ec=58/45 lis/c=68/68 les/c/f=69/69/0 sis=78) [1]/[0] r=0 lpr=78 pi=[68,78)/1 crt=51'1027 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:43:06 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 78 pg[10.e( v 51'1027 (0'0,51'1027] local-lis/les=68/69 n=5 ec=58/45 lis/c=68/68 les/c/f=69/69/0 sis=78) [1]/[0] r=0 lpr=78 pi=[68,78)/1 crt=51'1027 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Dec 06 09:43:06 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 78 pg[10.16( v 51'1027 (0'0,51'1027] local-lis/les=68/69 n=4 ec=58/45 lis/c=68/68 les/c/f=69/69/0 sis=78) [1]/[0] r=0 lpr=78 pi=[68,78)/1 crt=51'1027 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:43:06 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 78 pg[10.16( v 51'1027 (0'0,51'1027] local-lis/les=68/69 n=4 ec=58/45 lis/c=68/68 les/c/f=69/69/0 sis=78) [1]/[0] r=0 lpr=78 pi=[68,78)/1 crt=51'1027 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Dec 06 09:43:07 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 7.2 scrub starts
Dec 06 09:43:07 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 7.2 scrub ok
Dec 06 09:43:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:07 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feaf40032f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:43:07 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:43:07 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 09:43:07 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:07.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 09:43:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:07 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb200045e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:43:07 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e78 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 09:43:07 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:43:07 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:43:07 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:07.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:43:07 compute-1 ceph-mon[79770]: 7.6 scrub starts
Dec 06 09:43:07 compute-1 ceph-mon[79770]: 7.6 scrub ok
Dec 06 09:43:07 compute-1 ceph-mon[79770]: 8.1b scrub starts
Dec 06 09:43:07 compute-1 ceph-mon[79770]: 8.1b scrub ok
Dec 06 09:43:07 compute-1 ceph-mon[79770]: osdmap e78: 3 total, 3 up, 3 in
Dec 06 09:43:07 compute-1 ceph-mon[79770]: 7.11 scrub starts
Dec 06 09:43:07 compute-1 ceph-mon[79770]: 7.11 scrub ok
Dec 06 09:43:07 compute-1 ceph-mon[79770]: pgmap v92: 337 pgs: 2 active+clean+scrubbing, 4 unknown, 331 active+clean; 457 KiB data, 125 MiB used, 60 GiB / 60 GiB avail
Dec 06 09:43:07 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:43:07 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:43:07 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:43:07 compute-1 ceph-mon[79770]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Dec 06 09:43:07 compute-1 ceph-mon[79770]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Dec 06 09:43:07 compute-1 ceph-mon[79770]: Deploying daemon keepalived.rgw.default.compute-2.yurwwh on compute-2
Dec 06 09:43:07 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e79 e79: 3 total, 3 up, 3 in
Dec 06 09:43:07 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 79 pg[10.d( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=6 ec=58/45 lis/c=77/65 les/c/f=78/66/0 sis=79) [0] r=0 lpr=79 pi=[65,79)/1 luod=0'0 crt=51'1027 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:43:07 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 79 pg[10.d( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=6 ec=58/45 lis/c=77/65 les/c/f=78/66/0 sis=79) [0] r=0 lpr=79 pi=[65,79)/1 crt=51'1027 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:43:07 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 79 pg[10.5( v 78'1042 (0'0,78'1042] local-lis/les=0/0 n=6 ec=58/45 lis/c=77/65 les/c/f=78/66/0 sis=79) [0] r=0 lpr=79 pi=[65,79)/1 luod=0'0 crt=66'1039 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:43:07 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 79 pg[10.5( v 78'1042 (0'0,78'1042] local-lis/les=0/0 n=6 ec=58/45 lis/c=77/65 les/c/f=78/66/0 sis=79) [0] r=0 lpr=79 pi=[65,79)/1 crt=66'1039 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:43:07 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 79 pg[10.15( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=5 ec=58/45 lis/c=77/65 les/c/f=78/66/0 sis=79) [0] r=0 lpr=79 pi=[65,79)/1 luod=0'0 crt=51'1027 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:43:07 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 79 pg[10.15( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=5 ec=58/45 lis/c=77/65 les/c/f=78/66/0 sis=79) [0] r=0 lpr=79 pi=[65,79)/1 crt=51'1027 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:43:07 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 79 pg[10.1d( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=5 ec=58/45 lis/c=77/65 les/c/f=78/66/0 sis=79) [0] r=0 lpr=79 pi=[65,79)/1 luod=0'0 crt=51'1027 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:43:07 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 79 pg[10.1d( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=5 ec=58/45 lis/c=77/65 les/c/f=78/66/0 sis=79) [0] r=0 lpr=79 pi=[65,79)/1 crt=51'1027 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:43:07 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 79 pg[10.1e( v 51'1027 (0'0,51'1027] local-lis/les=78/79 n=5 ec=58/45 lis/c=68/68 les/c/f=69/69/0 sis=78) [1]/[0] async=[1] r=0 lpr=78 pi=[68,78)/1 crt=51'1027 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:43:07 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 79 pg[10.6( v 51'1027 (0'0,51'1027] local-lis/les=78/79 n=6 ec=58/45 lis/c=68/68 les/c/f=69/69/0 sis=78) [1]/[0] async=[1] r=0 lpr=78 pi=[68,78)/1 crt=51'1027 mlcod 0'0 active+remapped mbc={255={(0+1)=6}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:43:07 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 79 pg[10.e( v 51'1027 (0'0,51'1027] local-lis/les=78/79 n=5 ec=58/45 lis/c=68/68 les/c/f=69/69/0 sis=78) [1]/[0] async=[1] r=0 lpr=78 pi=[68,78)/1 crt=51'1027 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:43:08 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 79 pg[10.16( v 51'1027 (0'0,51'1027] local-lis/les=78/79 n=4 ec=58/45 lis/c=68/68 les/c/f=69/69/0 sis=78) [1]/[0] async=[1] r=0 lpr=78 pi=[68,78)/1 crt=51'1027 mlcod 0'0 active+remapped mbc={255={(0+1)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:43:08 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 7.3 deep-scrub starts
Dec 06 09:43:08 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 7.3 deep-scrub ok
Dec 06 09:43:08 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:08 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb00004050 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:43:08 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e80 e80: 3 total, 3 up, 3 in
Dec 06 09:43:08 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 80 pg[10.16( v 51'1027 (0'0,51'1027] local-lis/les=78/79 n=4 ec=58/45 lis/c=78/68 les/c/f=79/69/0 sis=80 pruub=15.006292343s) [1] async=[1] r=-1 lpr=80 pi=[68,80)/1 crt=51'1027 mlcod 51'1027 active pruub 228.041992188s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:43:08 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 80 pg[10.e( v 51'1027 (0'0,51'1027] local-lis/les=78/79 n=5 ec=58/45 lis/c=78/68 les/c/f=79/69/0 sis=80 pruub=15.000350952s) [1] async=[1] r=-1 lpr=80 pi=[68,80)/1 crt=51'1027 mlcod 51'1027 active pruub 228.036636353s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:43:08 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 80 pg[10.e( v 51'1027 (0'0,51'1027] local-lis/les=78/79 n=5 ec=58/45 lis/c=78/68 les/c/f=79/69/0 sis=80 pruub=15.000283241s) [1] r=-1 lpr=80 pi=[68,80)/1 crt=51'1027 mlcod 0'0 unknown NOTIFY pruub 228.036636353s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:43:08 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 80 pg[10.16( v 51'1027 (0'0,51'1027] local-lis/les=78/79 n=4 ec=58/45 lis/c=78/68 les/c/f=79/69/0 sis=80 pruub=15.005999565s) [1] r=-1 lpr=80 pi=[68,80)/1 crt=51'1027 mlcod 0'0 unknown NOTIFY pruub 228.041992188s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:43:08 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 80 pg[10.6( v 51'1027 (0'0,51'1027] local-lis/les=78/79 n=6 ec=58/45 lis/c=78/68 les/c/f=79/69/0 sis=80 pruub=14.998921394s) [1] async=[1] r=-1 lpr=80 pi=[68,80)/1 crt=51'1027 mlcod 51'1027 active pruub 228.036636353s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:43:08 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 80 pg[10.6( v 51'1027 (0'0,51'1027] local-lis/les=78/79 n=6 ec=58/45 lis/c=78/68 les/c/f=79/69/0 sis=80 pruub=14.998816490s) [1] r=-1 lpr=80 pi=[68,80)/1 crt=51'1027 mlcod 0'0 unknown NOTIFY pruub 228.036636353s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:43:09 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 80 pg[10.1e( v 51'1027 (0'0,51'1027] local-lis/les=78/79 n=5 ec=58/45 lis/c=78/68 les/c/f=79/69/0 sis=80 pruub=14.998106003s) [1] async=[1] r=-1 lpr=80 pi=[68,80)/1 crt=51'1027 mlcod 51'1027 active pruub 228.036529541s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:43:09 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 80 pg[10.1e( v 51'1027 (0'0,51'1027] local-lis/les=78/79 n=5 ec=58/45 lis/c=78/68 les/c/f=79/69/0 sis=80 pruub=14.997945786s) [1] r=-1 lpr=80 pi=[68,80)/1 crt=51'1027 mlcod 0'0 unknown NOTIFY pruub 228.036529541s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:43:09 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 80 pg[10.d( v 51'1027 (0'0,51'1027] local-lis/les=79/80 n=6 ec=58/45 lis/c=77/65 les/c/f=78/66/0 sis=79) [0] r=0 lpr=79 pi=[65,79)/1 crt=51'1027 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:43:09 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 80 pg[10.5( v 78'1042 (0'0,78'1042] local-lis/les=79/80 n=6 ec=58/45 lis/c=77/65 les/c/f=78/66/0 sis=79) [0] r=0 lpr=79 pi=[65,79)/1 crt=78'1042 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:43:09 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 80 pg[10.15( v 51'1027 (0'0,51'1027] local-lis/les=79/80 n=5 ec=58/45 lis/c=77/65 les/c/f=78/66/0 sis=79) [0] r=0 lpr=79 pi=[65,79)/1 crt=51'1027 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:43:09 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 80 pg[10.1d( v 51'1027 (0'0,51'1027] local-lis/les=79/80 n=5 ec=58/45 lis/c=77/65 les/c/f=78/66/0 sis=79) [0] r=0 lpr=79 pi=[65,79)/1 crt=51'1027 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:43:09 compute-1 ceph-mon[79770]: 7.2 scrub starts
Dec 06 09:43:09 compute-1 ceph-mon[79770]: 7.2 scrub ok
Dec 06 09:43:09 compute-1 ceph-mon[79770]: 9.6 scrub starts
Dec 06 09:43:09 compute-1 ceph-mon[79770]: 9.6 scrub ok
Dec 06 09:43:09 compute-1 ceph-mon[79770]: osdmap e79: 3 total, 3 up, 3 in
Dec 06 09:43:09 compute-1 ceph-mon[79770]: 7.a scrub starts
Dec 06 09:43:09 compute-1 ceph-mon[79770]: 7.a scrub ok
Dec 06 09:43:09 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 7.4 scrub starts
Dec 06 09:43:09 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 7.4 scrub ok
Dec 06 09:43:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:09 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb2400a640 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:43:09 compute-1 sshd-session[85349]: Accepted publickey for zuul from 192.168.122.30 port 59182 ssh2: ECDSA SHA256:r1j7aLsKAM+XxDNbzEU5vWGpGNCOaIBwc7FZdATPttA
Dec 06 09:43:09 compute-1 systemd-logind[788]: New session 36 of user zuul.
Dec 06 09:43:09 compute-1 systemd[1]: Started Session 36 of User zuul.
Dec 06 09:43:09 compute-1 sshd-session[85349]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 06 09:43:09 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:43:09 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:43:09 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:09.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:43:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:09 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feaf40032f0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:43:09 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:43:09 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:43:09 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:09.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:43:10 compute-1 ceph-mon[79770]: 7.3 deep-scrub starts
Dec 06 09:43:10 compute-1 ceph-mon[79770]: 7.3 deep-scrub ok
Dec 06 09:43:10 compute-1 ceph-mon[79770]: 12.15 scrub starts
Dec 06 09:43:10 compute-1 ceph-mon[79770]: 12.15 scrub ok
Dec 06 09:43:10 compute-1 ceph-mon[79770]: osdmap e80: 3 total, 3 up, 3 in
Dec 06 09:43:10 compute-1 ceph-mon[79770]: pgmap v95: 337 pgs: 2 active+clean+scrubbing, 4 unknown, 331 active+clean; 457 KiB data, 125 MiB used, 60 GiB / 60 GiB avail
Dec 06 09:43:10 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:43:10 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:43:10 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:43:10 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:43:10 compute-1 ceph-mon[79770]: 8.6 scrub starts
Dec 06 09:43:10 compute-1 ceph-mon[79770]: 8.6 scrub ok
Dec 06 09:43:10 compute-1 ceph-mon[79770]: Deploying daemon prometheus.compute-0 on compute-0
Dec 06 09:43:10 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e81 e81: 3 total, 3 up, 3 in
Dec 06 09:43:10 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 7.e scrub starts
Dec 06 09:43:10 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 7.e scrub ok
Dec 06 09:43:10 compute-1 python3.9[85503]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:43:10 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:10 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb200045e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:43:11 compute-1 ceph-mon[79770]: 7.4 scrub starts
Dec 06 09:43:11 compute-1 ceph-mon[79770]: 7.4 scrub ok
Dec 06 09:43:11 compute-1 ceph-mon[79770]: 12.d scrub starts
Dec 06 09:43:11 compute-1 ceph-mon[79770]: 12.d scrub ok
Dec 06 09:43:11 compute-1 ceph-mon[79770]: 9.8 deep-scrub starts
Dec 06 09:43:11 compute-1 ceph-mon[79770]: 9.8 deep-scrub ok
Dec 06 09:43:11 compute-1 ceph-mon[79770]: osdmap e81: 3 total, 3 up, 3 in
Dec 06 09:43:11 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:43:11 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "9"}]: dispatch
Dec 06 09:43:11 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "8"}]: dispatch
Dec 06 09:43:11 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 7.f scrub starts
Dec 06 09:43:11 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 7.f scrub ok
Dec 06 09:43:11 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:11 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb00004050 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:43:11 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e82 e82: 3 total, 3 up, 3 in
Dec 06 09:43:11 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 82 pg[6.8( v 50'39 (0'0,50'39] local-lis/les=54/55 n=0 ec=54/21 lis/c=54/54 les/c/f=55/55/0 sis=82 pruub=9.370536804s) [1] r=-1 lpr=82 pi=[54,82)/1 crt=50'39 lcod 0'0 mlcod 0'0 active pruub 224.791793823s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:43:11 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 82 pg[6.8( v 50'39 (0'0,50'39] local-lis/les=54/55 n=0 ec=54/21 lis/c=54/54 les/c/f=55/55/0 sis=82 pruub=9.370141029s) [1] r=-1 lpr=82 pi=[54,82)/1 crt=50'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 224.791793823s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:43:11 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:43:11 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:43:11 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:11.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:43:11 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:11 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb2400a640 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:43:11 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:43:11 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:43:11 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:11.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:43:12 compute-1 ceph-mon[79770]: 7.e scrub starts
Dec 06 09:43:12 compute-1 ceph-mon[79770]: 7.e scrub ok
Dec 06 09:43:12 compute-1 ceph-mon[79770]: 12.5 scrub starts
Dec 06 09:43:12 compute-1 ceph-mon[79770]: 12.5 scrub ok
Dec 06 09:43:12 compute-1 ceph-mon[79770]: pgmap v97: 337 pgs: 337 active+clean; 458 KiB data, 125 MiB used, 60 GiB / 60 GiB avail; 34 KiB/s rd, 0 B/s wr, 66 op/s; 312 B/s, 16 objects/s recovering
Dec 06 09:43:12 compute-1 ceph-mon[79770]: 9.5 scrub starts
Dec 06 09:43:12 compute-1 ceph-mon[79770]: 9.5 scrub ok
Dec 06 09:43:12 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "9"}]': finished
Dec 06 09:43:12 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "8"}]': finished
Dec 06 09:43:12 compute-1 ceph-mon[79770]: osdmap e82: 3 total, 3 up, 3 in
Dec 06 09:43:12 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 7.8 scrub starts
Dec 06 09:43:12 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 7.8 scrub ok
Dec 06 09:43:12 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e83 e83: 3 total, 3 up, 3 in
Dec 06 09:43:12 compute-1 sudo[85716]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ynvrzlgtljhrczszuifnafbkkkhmrsmm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014191.9489942-58-276318310111033/AnsiballZ_command.py'
Dec 06 09:43:12 compute-1 sudo[85716]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:12 compute-1 python3.9[85718]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail
                                            pushd /var/tmp
                                            curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
                                            pushd repo-setup-main
                                            python3 -m venv ./venv
                                            PBR_VERSION=0.0.0 ./venv/bin/pip install ./
                                            ./venv/bin/repo-setup current-podified -b antelope
                                            popd
                                            rm -rf repo-setup-main
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:43:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:12 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feaf4004000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:43:12 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 09:43:13 compute-1 ceph-mon[79770]: 7.f scrub starts
Dec 06 09:43:13 compute-1 ceph-mon[79770]: 7.f scrub ok
Dec 06 09:43:13 compute-1 ceph-mon[79770]: 9.3 scrub starts
Dec 06 09:43:13 compute-1 ceph-mon[79770]: 9.3 scrub ok
Dec 06 09:43:13 compute-1 ceph-mon[79770]: 12.0 scrub starts
Dec 06 09:43:13 compute-1 ceph-mon[79770]: 12.0 scrub ok
Dec 06 09:43:13 compute-1 ceph-mon[79770]: 7.8 scrub starts
Dec 06 09:43:13 compute-1 ceph-mon[79770]: 7.8 scrub ok
Dec 06 09:43:13 compute-1 ceph-mon[79770]: osdmap e83: 3 total, 3 up, 3 in
Dec 06 09:43:13 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "10"}]: dispatch
Dec 06 09:43:13 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "9"}]: dispatch
Dec 06 09:43:13 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 7.b scrub starts
Dec 06 09:43:13 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:13 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feaf4004000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:43:13 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 7.b scrub ok
Dec 06 09:43:13 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e84 e84: 3 total, 3 up, 3 in
Dec 06 09:43:13 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 84 pg[6.9( empty local-lis/les=0/0 n=0 ec=54/21 lis/c=58/58 les/c/f=59/59/0 sis=84) [0] r=0 lpr=84 pi=[58,84)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:43:13 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:43:13 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:43:13 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:13.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:43:13 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:13 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb200045e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:43:13 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:43:13 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:43:13 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:13.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:43:14 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 84 pg[10.18( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=84) [0] r=0 lpr=84 pi=[58,84)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:43:14 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 84 pg[10.8( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=84) [0] r=0 lpr=84 pi=[58,84)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:43:14 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 7.13 scrub starts
Dec 06 09:43:14 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 7.13 scrub ok
Dec 06 09:43:14 compute-1 ceph-mon[79770]: 12.1f scrub starts
Dec 06 09:43:14 compute-1 ceph-mon[79770]: 12.1f scrub ok
Dec 06 09:43:14 compute-1 ceph-mon[79770]: 9.17 scrub starts
Dec 06 09:43:14 compute-1 ceph-mon[79770]: pgmap v100: 337 pgs: 337 active+clean; 458 KiB data, 125 MiB used, 60 GiB / 60 GiB avail; 35 KiB/s rd, 0 B/s wr, 66 op/s; 315 B/s, 16 objects/s recovering
Dec 06 09:43:14 compute-1 ceph-mon[79770]: 9.17 scrub ok
Dec 06 09:43:14 compute-1 ceph-mon[79770]: 7.b scrub starts
Dec 06 09:43:14 compute-1 ceph-mon[79770]: 7.b scrub ok
Dec 06 09:43:14 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "10"}]': finished
Dec 06 09:43:14 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "9"}]': finished
Dec 06 09:43:14 compute-1 ceph-mon[79770]: osdmap e84: 3 total, 3 up, 3 in
Dec 06 09:43:14 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e85 e85: 3 total, 3 up, 3 in
Dec 06 09:43:14 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 85 pg[10.18( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=85) [0]/[1] r=-1 lpr=85 pi=[58,85)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:43:14 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 85 pg[10.18( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=85) [0]/[1] r=-1 lpr=85 pi=[58,85)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 09:43:14 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 85 pg[10.8( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=85) [0]/[1] r=-1 lpr=85 pi=[58,85)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:43:14 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 85 pg[10.8( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=58/58 les/c/f=59/59/0 sis=85) [0]/[1] r=-1 lpr=85 pi=[58,85)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 09:43:14 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 85 pg[6.9( v 50'39 (0'0,50'39] local-lis/les=84/85 n=0 ec=54/21 lis/c=58/58 les/c/f=59/59/0 sis=84) [0] r=0 lpr=84 pi=[58,84)/1 crt=50'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:43:14 compute-1 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #13. Immutable memtables: 0.
Dec 06 09:43:14 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:43:14.515759) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 06 09:43:14 compute-1 ceph-mon[79770]: rocksdb: [db/flush_job.cc:856] [default] [JOB 3] Flushing memtable with next log file: 13
Dec 06 09:43:14 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014194516128, "job": 3, "event": "flush_started", "num_memtables": 1, "num_entries": 7353, "num_deletes": 255, "total_data_size": 21105504, "memory_usage": 21943440, "flush_reason": "Manual Compaction"}
Dec 06 09:43:14 compute-1 ceph-mon[79770]: rocksdb: [db/flush_job.cc:885] [default] [JOB 3] Level-0 flush table #14: started
Dec 06 09:43:14 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014194616079, "cf_name": "default", "job": 3, "event": "table_file_creation", "file_number": 14, "file_size": 13003679, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 249, "largest_seqno": 7358, "table_properties": {"data_size": 12975211, "index_size": 18113, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 9285, "raw_key_size": 90236, "raw_average_key_size": 24, "raw_value_size": 12903944, "raw_average_value_size": 3480, "num_data_blocks": 803, "num_entries": 3707, "num_filter_entries": 3707, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014012, "oldest_key_time": 1765014012, "file_creation_time": 1765014194, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 14, "seqno_to_time_mapping": "N/A"}}
Dec 06 09:43:14 compute-1 ceph-mon[79770]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 3] Flush lasted 100373 microseconds, and 49877 cpu microseconds.
Dec 06 09:43:14 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:43:14.616249) [db/flush_job.cc:967] [default] [JOB 3] Level-0 flush table #14: 13003679 bytes OK
Dec 06 09:43:14 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:43:14.616287) [db/memtable_list.cc:519] [default] Level-0 commit table #14 started
Dec 06 09:43:14 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:43:14.618940) [db/memtable_list.cc:722] [default] Level-0 commit table #14: memtable #1 done
Dec 06 09:43:14 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:43:14.618979) EVENT_LOG_v1 {"time_micros": 1765014194618972, "job": 3, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [2, 0, 0, 0, 0, 0, 0], "immutable_memtables": 0}
Dec 06 09:43:14 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:43:14.619004) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: files[2 0 0 0 0 0 0] max score 0.50
Dec 06 09:43:14 compute-1 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 3] Try to delete WAL files size 21065478, prev total WAL file size 21065478, number of live WAL files 2.
Dec 06 09:43:14 compute-1 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000009.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 09:43:14 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:43:14.623606) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730030' seq:72057594037927935, type:22 .. '7061786F7300323532' seq:0, type:0; will stop at (end)
Dec 06 09:43:14 compute-1 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 4] Compacting 2@0 files to L6, score -1.00
Dec 06 09:43:14 compute-1 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 3 Base level 0, inputs: [14(12MB) 8(1648B)]
Dec 06 09:43:14 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014194624103, "job": 4, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [14, 8], "score": -1, "input_data_size": 13005327, "oldest_snapshot_seqno": -1}
Dec 06 09:43:14 compute-1 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 4] Generated table #15: 3456 keys, 13000182 bytes, temperature: kUnknown
Dec 06 09:43:14 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:14 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feaf4004000 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:43:14 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014194746384, "cf_name": "default", "job": 4, "event": "table_file_creation", "file_number": 15, "file_size": 13000182, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12972411, "index_size": 18061, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 8645, "raw_key_size": 86058, "raw_average_key_size": 24, "raw_value_size": 12904224, "raw_average_value_size": 3733, "num_data_blocks": 801, "num_entries": 3456, "num_filter_entries": 3456, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014012, "oldest_key_time": 0, "file_creation_time": 1765014194, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 15, "seqno_to_time_mapping": "N/A"}}
Dec 06 09:43:14 compute-1 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 09:43:14 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:43:14.746667) [db/compaction/compaction_job.cc:1663] [default] [JOB 4] Compacted 2@0 files to L6 => 13000182 bytes
Dec 06 09:43:14 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:43:14.747960) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 106.4 rd, 106.4 wr, level 6, files in(2, 0) out(1 +0 blob) MB in(12.4, 0.0 +0.0 blob) out(12.4 +0.0 blob), read-write-amplify(2.0) write-amplify(1.0) OK, records in: 3712, records dropped: 256 output_compression: NoCompression
Dec 06 09:43:14 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:43:14.748125) EVENT_LOG_v1 {"time_micros": 1765014194748111, "job": 4, "event": "compaction_finished", "compaction_time_micros": 122216, "compaction_time_cpu_micros": 73133, "output_level": 6, "num_output_files": 1, "total_output_size": 13000182, "num_input_records": 3712, "num_output_records": 3456, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 06 09:43:14 compute-1 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000014.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 09:43:14 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014194750354, "job": 4, "event": "table_file_deletion", "file_number": 14}
Dec 06 09:43:14 compute-1 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000008.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 09:43:14 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014194750495, "job": 4, "event": "table_file_deletion", "file_number": 8}
Dec 06 09:43:14 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:43:14.623290) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 09:43:15 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:15 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb00004050 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:43:15 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 7.9 scrub starts
Dec 06 09:43:15 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 7.9 scrub ok
Dec 06 09:43:15 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e86 e86: 3 total, 3 up, 3 in
Dec 06 09:43:15 compute-1 ceph-mon[79770]: 7.14 scrub starts
Dec 06 09:43:15 compute-1 ceph-mon[79770]: 7.14 scrub ok
Dec 06 09:43:15 compute-1 ceph-mon[79770]: 12.1b deep-scrub starts
Dec 06 09:43:15 compute-1 ceph-mon[79770]: 12.1b deep-scrub ok
Dec 06 09:43:15 compute-1 ceph-mon[79770]: 7.13 scrub starts
Dec 06 09:43:15 compute-1 ceph-mon[79770]: 7.13 scrub ok
Dec 06 09:43:15 compute-1 ceph-mon[79770]: osdmap e85: 3 total, 3 up, 3 in
Dec 06 09:43:15 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "11"}]: dispatch
Dec 06 09:43:15 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "10"}]: dispatch
Dec 06 09:43:15 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:43:15 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:43:15 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:43:15 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "mgr module enable", "module": "prometheus"}]: dispatch
Dec 06 09:43:15 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:43:15 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:43:15 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:43:15 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:15.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:43:15 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:15 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb2400a640 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:43:15 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:43:15 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:43:15 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:15.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:43:15 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e87 e87: 3 total, 3 up, 3 in
Dec 06 09:43:15 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 87 pg[10.8( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=6 ec=58/45 lis/c=85/58 les/c/f=86/59/0 sis=87) [0] r=0 lpr=87 pi=[58,87)/1 luod=0'0 crt=51'1027 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:43:15 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 87 pg[10.8( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=6 ec=58/45 lis/c=85/58 les/c/f=86/59/0 sis=87) [0] r=0 lpr=87 pi=[58,87)/1 crt=51'1027 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:43:15 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 87 pg[10.18( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=5 ec=58/45 lis/c=85/58 les/c/f=86/59/0 sis=87) [0] r=0 lpr=87 pi=[58,87)/1 luod=0'0 crt=51'1027 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:43:15 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 87 pg[10.18( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=5 ec=58/45 lis/c=85/58 les/c/f=86/59/0 sis=87) [0] r=0 lpr=87 pi=[58,87)/1 crt=51'1027 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:43:16 compute-1 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #16. Immutable memtables: 0.
Dec 06 09:43:16 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:43:16.010288) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 06 09:43:16 compute-1 ceph-mon[79770]: rocksdb: [db/flush_job.cc:856] [default] [JOB 5] Flushing memtable with next log file: 16
Dec 06 09:43:16 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014196010404, "job": 5, "event": "flush_started", "num_memtables": 1, "num_entries": 342, "num_deletes": 253, "total_data_size": 249897, "memory_usage": 257848, "flush_reason": "Manual Compaction"}
Dec 06 09:43:16 compute-1 ceph-mon[79770]: rocksdb: [db/flush_job.cc:885] [default] [JOB 5] Level-0 flush table #17: started
Dec 06 09:43:16 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014196013436, "cf_name": "default", "job": 5, "event": "table_file_creation", "file_number": 17, "file_size": 165706, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 7363, "largest_seqno": 7700, "table_properties": {"data_size": 163515, "index_size": 355, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 773, "raw_key_size": 4664, "raw_average_key_size": 15, "raw_value_size": 159093, "raw_average_value_size": 526, "num_data_blocks": 16, "num_entries": 302, "num_filter_entries": 302, "num_deletions": 253, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014195, "oldest_key_time": 1765014195, "file_creation_time": 1765014196, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 17, "seqno_to_time_mapping": "N/A"}}
Dec 06 09:43:16 compute-1 ceph-mon[79770]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 5] Flush lasted 3186 microseconds, and 1332 cpu microseconds.
Dec 06 09:43:16 compute-1 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 09:43:16 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:43:16.013476) [db/flush_job.cc:967] [default] [JOB 5] Level-0 flush table #17: 165706 bytes OK
Dec 06 09:43:16 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:43:16.013495) [db/memtable_list.cc:519] [default] Level-0 commit table #17 started
Dec 06 09:43:16 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:43:16.014684) [db/memtable_list.cc:722] [default] Level-0 commit table #17: memtable #1 done
Dec 06 09:43:16 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:43:16.014757) EVENT_LOG_v1 {"time_micros": 1765014196014751, "job": 5, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 06 09:43:16 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:43:16.014774) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 06 09:43:16 compute-1 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 5] Try to delete WAL files size 247490, prev total WAL file size 247490, number of live WAL files 2.
Dec 06 09:43:16 compute-1 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000013.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 09:43:16 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:43:16.015309) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B760030' seq:72057594037927935, type:22 .. '6B7600323534' seq:0, type:0; will stop at (end)
Dec 06 09:43:16 compute-1 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 6] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 06 09:43:16 compute-1 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 5 Base level 0, inputs: [17(161KB)], [15(12MB)]
Dec 06 09:43:16 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014196015436, "job": 6, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [17], "files_L6": [15], "score": -1, "input_data_size": 13165888, "oldest_snapshot_seqno": -1}
Dec 06 09:43:16 compute-1 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 6] Generated table #18: 3236 keys, 12743476 bytes, temperature: kUnknown
Dec 06 09:43:16 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014196096444, "cf_name": "default", "job": 6, "event": "table_file_creation", "file_number": 18, "file_size": 12743476, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12717033, "index_size": 17245, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 8133, "raw_key_size": 83338, "raw_average_key_size": 25, "raw_value_size": 12652523, "raw_average_value_size": 3909, "num_data_blocks": 748, "num_entries": 3236, "num_filter_entries": 3236, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014012, "oldest_key_time": 0, "file_creation_time": 1765014196, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 18, "seqno_to_time_mapping": "N/A"}}
Dec 06 09:43:16 compute-1 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 09:43:16 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:43:16.096802) [db/compaction/compaction_job.cc:1663] [default] [JOB 6] Compacted 1@0 + 1@6 files to L6 => 12743476 bytes
Dec 06 09:43:16 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:43:16.098983) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 162.3 rd, 157.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.2, 12.4 +0.0 blob) out(12.2 +0.0 blob), read-write-amplify(156.4) write-amplify(76.9) OK, records in: 3758, records dropped: 522 output_compression: NoCompression
Dec 06 09:43:16 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:43:16.099008) EVENT_LOG_v1 {"time_micros": 1765014196098996, "job": 6, "event": "compaction_finished", "compaction_time_micros": 81130, "compaction_time_cpu_micros": 27291, "output_level": 6, "num_output_files": 1, "total_output_size": 12743476, "num_input_records": 3758, "num_output_records": 3236, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 06 09:43:16 compute-1 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000017.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 09:43:16 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014196099415, "job": 6, "event": "table_file_deletion", "file_number": 17}
Dec 06 09:43:16 compute-1 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000015.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 09:43:16 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014196101250, "job": 6, "event": "table_file_deletion", "file_number": 15}
Dec 06 09:43:16 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:43:16.015219) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 09:43:16 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:43:16.101415) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 09:43:16 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:43:16.101424) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 09:43:16 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:43:16.101429) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 09:43:16 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:43:16.101432) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 09:43:16 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:43:16.101433) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 09:43:16 compute-1 ceph-mgr[80080]: mgr handle_mgr_map respawning because set of enabled modules changed!
Dec 06 09:43:16 compute-1 ceph-mgr[80080]: mgr respawn  e: '/usr/bin/ceph-mgr'
Dec 06 09:43:16 compute-1 ceph-mgr[80080]: mgr respawn  0: '/usr/bin/ceph-mgr'
Dec 06 09:43:16 compute-1 ceph-mgr[80080]: mgr respawn  1: '-n'
Dec 06 09:43:16 compute-1 ceph-mgr[80080]: mgr respawn  2: 'mgr.compute-1.sauzid'
Dec 06 09:43:16 compute-1 ceph-mgr[80080]: mgr respawn  3: '-f'
Dec 06 09:43:16 compute-1 ceph-mgr[80080]: mgr respawn  4: '--setuser'
Dec 06 09:43:16 compute-1 ceph-mgr[80080]: mgr respawn  5: 'ceph'
Dec 06 09:43:16 compute-1 ceph-mgr[80080]: mgr respawn  6: '--setgroup'
Dec 06 09:43:16 compute-1 ceph-mgr[80080]: mgr respawn  7: 'ceph'
Dec 06 09:43:16 compute-1 ceph-mgr[80080]: mgr respawn  8: '--default-log-to-file=false'
Dec 06 09:43:16 compute-1 ceph-mgr[80080]: mgr respawn  9: '--default-log-to-journald=true'
Dec 06 09:43:16 compute-1 ceph-mgr[80080]: mgr respawn  10: '--default-log-to-stderr=false'
Dec 06 09:43:16 compute-1 ceph-mgr[80080]: mgr respawn respawning with exe /usr/bin/ceph-mgr
Dec 06 09:43:16 compute-1 ceph-mgr[80080]: mgr respawn  exe_path /proc/self/exe
Dec 06 09:43:16 compute-1 sshd-session[81519]: Connection closed by 192.168.122.100 port 59718
Dec 06 09:43:16 compute-1 sshd-session[81500]: pam_unix(sshd:session): session closed for user ceph-admin
Dec 06 09:43:16 compute-1 systemd[1]: session-34.scope: Deactivated successfully.
Dec 06 09:43:16 compute-1 systemd[1]: session-34.scope: Consumed 26.960s CPU time.
Dec 06 09:43:16 compute-1 systemd-logind[788]: Session 34 logged out. Waiting for processes to exit.
Dec 06 09:43:16 compute-1 systemd-logind[788]: Removed session 34.
Dec 06 09:43:16 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: ignoring --setuser ceph since I am not root
Dec 06 09:43:16 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: ignoring --setgroup ceph since I am not root
Dec 06 09:43:16 compute-1 ceph-mgr[80080]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Dec 06 09:43:16 compute-1 ceph-mgr[80080]: pidfile_write: ignore empty --pid-file
Dec 06 09:43:16 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 7.1e scrub starts
Dec 06 09:43:16 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'alerts'
Dec 06 09:43:16 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 7.1e scrub ok
Dec 06 09:43:16 compute-1 ceph-mgr[80080]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec 06 09:43:16 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:43:16.475+0000 7f263b68c140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec 06 09:43:16 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'balancer'
Dec 06 09:43:16 compute-1 ceph-mon[79770]: pgmap v103: 337 pgs: 2 active+remapped, 335 active+clean; 458 KiB data, 125 MiB used, 60 GiB / 60 GiB avail; 109 B/s, 2 objects/s recovering
Dec 06 09:43:16 compute-1 ceph-mon[79770]: 9.16 scrub starts
Dec 06 09:43:16 compute-1 ceph-mon[79770]: 12.16 scrub starts
Dec 06 09:43:16 compute-1 ceph-mon[79770]: 9.16 scrub ok
Dec 06 09:43:16 compute-1 ceph-mon[79770]: 12.16 scrub ok
Dec 06 09:43:16 compute-1 ceph-mon[79770]: 7.9 scrub starts
Dec 06 09:43:16 compute-1 ceph-mon[79770]: 7.9 scrub ok
Dec 06 09:43:16 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "11"}]': finished
Dec 06 09:43:16 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "10"}]': finished
Dec 06 09:43:16 compute-1 ceph-mon[79770]: osdmap e86: 3 total, 3 up, 3 in
Dec 06 09:43:16 compute-1 ceph-mon[79770]: osdmap e87: 3 total, 3 up, 3 in
Dec 06 09:43:16 compute-1 ceph-mon[79770]: from='mgr.14400 192.168.122.100:0/3311628268' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "mgr module enable", "module": "prometheus"}]': finished
Dec 06 09:43:16 compute-1 ceph-mon[79770]: mgrmap e30: compute-0.qhdjwa(active, since 107s), standbys: compute-1.sauzid, compute-2.oazbvn
Dec 06 09:43:16 compute-1 ceph-mgr[80080]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec 06 09:43:16 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:43:16.577+0000 7f263b68c140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec 06 09:43:16 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'cephadm'
Dec 06 09:43:16 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:16 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feaf4004000 fd 14 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:43:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:17 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feaf4004000 fd 14 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:43:17 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e88 e88: 3 total, 3 up, 3 in
Dec 06 09:43:17 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 88 pg[10.8( v 51'1027 (0'0,51'1027] local-lis/les=87/88 n=6 ec=58/45 lis/c=85/58 les/c/f=86/59/0 sis=87) [0] r=0 lpr=87 pi=[58,87)/1 crt=51'1027 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:43:17 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 88 pg[10.18( v 51'1027 (0'0,51'1027] local-lis/les=87/88 n=5 ec=58/45 lis/c=85/58 les/c/f=86/59/0 sis=87) [0] r=0 lpr=87 pi=[58,87)/1 crt=51'1027 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:43:17 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 7.18 scrub starts
Dec 06 09:43:17 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 7.18 scrub ok
Dec 06 09:43:17 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'crash'
Dec 06 09:43:17 compute-1 ceph-mon[79770]: 12.14 scrub starts
Dec 06 09:43:17 compute-1 ceph-mon[79770]: 12.14 scrub ok
Dec 06 09:43:17 compute-1 ceph-mon[79770]: 9.7 scrub starts
Dec 06 09:43:17 compute-1 ceph-mon[79770]: 9.7 scrub ok
Dec 06 09:43:17 compute-1 ceph-mon[79770]: 7.1e scrub starts
Dec 06 09:43:17 compute-1 ceph-mon[79770]: 7.1e scrub ok
Dec 06 09:43:17 compute-1 ceph-mon[79770]: osdmap e88: 3 total, 3 up, 3 in
Dec 06 09:43:17 compute-1 ceph-mgr[80080]: mgr[py] Module crash has missing NOTIFY_TYPES member
Dec 06 09:43:17 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'dashboard'
Dec 06 09:43:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:43:17.717+0000 7f263b68c140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Dec 06 09:43:17 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:43:17 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:43:17 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:17.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:43:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:17 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb200045e0 fd 14 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:43:17 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 09:43:17 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:43:17 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:43:17 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:17.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:43:18 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e89 e89: 3 total, 3 up, 3 in
Dec 06 09:43:18 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 7.10 scrub starts
Dec 06 09:43:18 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 7.10 scrub ok
Dec 06 09:43:18 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'devicehealth'
Dec 06 09:43:18 compute-1 ceph-mgr[80080]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec 06 09:43:18 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'diskprediction_local'
Dec 06 09:43:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:43:18.472+0000 7f263b68c140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec 06 09:43:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Dec 06 09:43:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Dec 06 09:43:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]:   from numpy import show_config as show_numpy_config
Dec 06 09:43:18 compute-1 ceph-mon[79770]: 12.f scrub starts
Dec 06 09:43:18 compute-1 ceph-mon[79770]: 12.f scrub ok
Dec 06 09:43:18 compute-1 ceph-mon[79770]: 8.2 scrub starts
Dec 06 09:43:18 compute-1 ceph-mon[79770]: 8.2 scrub ok
Dec 06 09:43:18 compute-1 ceph-mon[79770]: 7.18 scrub starts
Dec 06 09:43:18 compute-1 ceph-mon[79770]: 7.18 scrub ok
Dec 06 09:43:18 compute-1 ceph-mon[79770]: osdmap e89: 3 total, 3 up, 3 in
Dec 06 09:43:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:43:18.731+0000 7f263b68c140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec 06 09:43:18 compute-1 ceph-mgr[80080]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Dec 06 09:43:18 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'influx'
Dec 06 09:43:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:18 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb00004050 fd 14 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:43:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:43:18.819+0000 7f263b68c140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Dec 06 09:43:18 compute-1 ceph-mgr[80080]: mgr[py] Module influx has missing NOTIFY_TYPES member
Dec 06 09:43:18 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'insights'
Dec 06 09:43:18 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'iostat'
Dec 06 09:43:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:43:18.987+0000 7f263b68c140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec 06 09:43:18 compute-1 ceph-mgr[80080]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Dec 06 09:43:18 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'k8sevents'
Dec 06 09:43:19 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:19 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb00004050 fd 14 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:43:19 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 11.0 scrub starts
Dec 06 09:43:19 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 11.0 scrub ok
Dec 06 09:43:19 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'localpool'
Dec 06 09:43:19 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'mds_autoscaler'
Dec 06 09:43:19 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:43:19 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:43:19 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:19.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:43:19 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:19 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb00004050 fd 14 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:43:19 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'mirroring'
Dec 06 09:43:19 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:43:19 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:43:19 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:19.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:43:19 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'nfs'
Dec 06 09:43:20 compute-1 sudo[85716]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:20 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:43:20.260+0000 7f263b68c140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec 06 09:43:20 compute-1 ceph-mgr[80080]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Dec 06 09:43:20 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'orchestrator'
Dec 06 09:43:20 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 11.c scrub starts
Dec 06 09:43:20 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 11.c scrub ok
Dec 06 09:43:20 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e90 e90: 3 total, 3 up, 3 in
Dec 06 09:43:20 compute-1 ceph-mon[79770]: 12.1 scrub starts
Dec 06 09:43:20 compute-1 ceph-mon[79770]: 11.17 scrub starts
Dec 06 09:43:20 compute-1 ceph-mon[79770]: 12.1 scrub ok
Dec 06 09:43:20 compute-1 ceph-mon[79770]: 11.17 scrub ok
Dec 06 09:43:20 compute-1 ceph-mon[79770]: 7.10 scrub starts
Dec 06 09:43:20 compute-1 ceph-mon[79770]: 7.10 scrub ok
Dec 06 09:43:20 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:43:20.547+0000 7f263b68c140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec 06 09:43:20 compute-1 ceph-mgr[80080]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Dec 06 09:43:20 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'osd_perf_query'
Dec 06 09:43:20 compute-1 sshd-session[85352]: Connection closed by 192.168.122.30 port 59182
Dec 06 09:43:20 compute-1 sshd-session[85349]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:43:20 compute-1 systemd[1]: session-36.scope: Deactivated successfully.
Dec 06 09:43:20 compute-1 systemd[1]: session-36.scope: Consumed 9.324s CPU time.
Dec 06 09:43:20 compute-1 systemd-logind[788]: Session 36 logged out. Waiting for processes to exit.
Dec 06 09:43:20 compute-1 systemd-logind[788]: Removed session 36.
Dec 06 09:43:20 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:43:20.634+0000 7f263b68c140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec 06 09:43:20 compute-1 ceph-mgr[80080]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Dec 06 09:43:20 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'osd_support'
Dec 06 09:43:20 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:43:20.726+0000 7f263b68c140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec 06 09:43:20 compute-1 ceph-mgr[80080]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Dec 06 09:43:20 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'pg_autoscaler'
Dec 06 09:43:20 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:20 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb00004050 fd 14 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:43:20 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:43:20.812+0000 7f263b68c140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec 06 09:43:20 compute-1 ceph-mgr[80080]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Dec 06 09:43:20 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'progress'
Dec 06 09:43:20 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:43:20.895+0000 7f263b68c140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Dec 06 09:43:20 compute-1 ceph-mgr[80080]: mgr[py] Module progress has missing NOTIFY_TYPES member
Dec 06 09:43:20 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'prometheus'
Dec 06 09:43:21 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:43:21.276+0000 7f263b68c140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec 06 09:43:21 compute-1 ceph-mgr[80080]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Dec 06 09:43:21 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'rbd_support'
Dec 06 09:43:21 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:21 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb00004050 fd 14 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:43:21 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 11.b scrub starts
Dec 06 09:43:21 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 11.b scrub ok
Dec 06 09:43:21 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:43:21.391+0000 7f263b68c140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec 06 09:43:21 compute-1 ceph-mgr[80080]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Dec 06 09:43:21 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'restful'
Dec 06 09:43:21 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'rgw'
Dec 06 09:43:21 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:43:21 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:43:21 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:21.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:43:21 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:21 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb00004050 fd 14 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:43:21 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:43:21.910+0000 7f263b68c140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec 06 09:43:21 compute-1 ceph-mgr[80080]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Dec 06 09:43:21 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'rook'
Dec 06 09:43:21 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:43:21 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:43:21 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:21.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:43:22 compute-1 ceph-mon[79770]: 11.13 scrub starts
Dec 06 09:43:22 compute-1 ceph-mon[79770]: 11.13 scrub ok
Dec 06 09:43:22 compute-1 ceph-mon[79770]: 11.1a scrub starts
Dec 06 09:43:22 compute-1 ceph-mon[79770]: 11.1a scrub ok
Dec 06 09:43:22 compute-1 ceph-mon[79770]: 11.0 scrub starts
Dec 06 09:43:22 compute-1 ceph-mon[79770]: 11.0 scrub ok
Dec 06 09:43:22 compute-1 ceph-mon[79770]: 11.a scrub starts
Dec 06 09:43:22 compute-1 ceph-mon[79770]: 11.a scrub ok
Dec 06 09:43:22 compute-1 ceph-mon[79770]: 11.1e scrub starts
Dec 06 09:43:22 compute-1 ceph-mon[79770]: 11.1e scrub ok
Dec 06 09:43:22 compute-1 ceph-mon[79770]: 11.c scrub starts
Dec 06 09:43:22 compute-1 ceph-mon[79770]: 11.c scrub ok
Dec 06 09:43:22 compute-1 ceph-mon[79770]: osdmap e90: 3 total, 3 up, 3 in
Dec 06 09:43:22 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 11.9 scrub starts
Dec 06 09:43:22 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 11.9 scrub ok
Dec 06 09:43:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:43:22.612+0000 7f263b68c140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Dec 06 09:43:22 compute-1 ceph-mgr[80080]: mgr[py] Module rook has missing NOTIFY_TYPES member
Dec 06 09:43:22 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'selftest'
Dec 06 09:43:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:43:22.690+0000 7f263b68c140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec 06 09:43:22 compute-1 ceph-mgr[80080]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Dec 06 09:43:22 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'snap_schedule'
Dec 06 09:43:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:22 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb18001080 fd 14 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:43:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:43:22.789+0000 7f263b68c140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Dec 06 09:43:22 compute-1 ceph-mgr[80080]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Dec 06 09:43:22 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'stats'
Dec 06 09:43:22 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'status'
Dec 06 09:43:22 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 09:43:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:43:22.966+0000 7f263b68c140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Dec 06 09:43:22 compute-1 ceph-mgr[80080]: mgr[py] Module status has missing NOTIFY_TYPES member
Dec 06 09:43:22 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'telegraf'
Dec 06 09:43:23 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:43:23.043+0000 7f263b68c140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec 06 09:43:23 compute-1 ceph-mgr[80080]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Dec 06 09:43:23 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'telemetry'
Dec 06 09:43:23 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:43:23.257+0000 7f263b68c140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec 06 09:43:23 compute-1 ceph-mgr[80080]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Dec 06 09:43:23 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'test_orchestrator'
Dec 06 09:43:23 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:23 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb2400a640 fd 14 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:43:23 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 11.d scrub starts
Dec 06 09:43:23 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 11.d scrub ok
Dec 06 09:43:23 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:43:23.505+0000 7f263b68c140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec 06 09:43:23 compute-1 ceph-mgr[80080]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Dec 06 09:43:23 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'volumes'
Dec 06 09:43:23 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e91 e91: 3 total, 3 up, 3 in
Dec 06 09:43:23 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:43:23 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:43:23 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:23.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:43:23 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:43:23.814+0000 7f263b68c140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec 06 09:43:23 compute-1 ceph-mgr[80080]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Dec 06 09:43:23 compute-1 ceph-mgr[80080]: mgr[py] Loading python module 'zabbix'
Dec 06 09:43:23 compute-1 ceph-mon[79770]: 11.16 scrub starts
Dec 06 09:43:23 compute-1 ceph-mon[79770]: 11.16 scrub ok
Dec 06 09:43:23 compute-1 ceph-mon[79770]: 11.1c scrub starts
Dec 06 09:43:23 compute-1 ceph-mon[79770]: 11.1c scrub ok
Dec 06 09:43:23 compute-1 ceph-mon[79770]: 11.b scrub starts
Dec 06 09:43:23 compute-1 ceph-mon[79770]: 11.b scrub ok
Dec 06 09:43:23 compute-1 ceph-mon[79770]: 12.17 scrub starts
Dec 06 09:43:23 compute-1 ceph-mon[79770]: 12.17 scrub ok
Dec 06 09:43:23 compute-1 ceph-mon[79770]: 11.7 scrub starts
Dec 06 09:43:23 compute-1 ceph-mon[79770]: 11.7 scrub ok
Dec 06 09:43:23 compute-1 ceph-mon[79770]: 11.9 scrub starts
Dec 06 09:43:23 compute-1 ceph-mon[79770]: 11.9 scrub ok
Dec 06 09:43:23 compute-1 ceph-mon[79770]: Standby manager daemon compute-2.oazbvn restarted
Dec 06 09:43:23 compute-1 ceph-mon[79770]: Standby manager daemon compute-2.oazbvn started
Dec 06 09:43:23 compute-1 ceph-mon[79770]: Active manager daemon compute-0.qhdjwa restarted
Dec 06 09:43:23 compute-1 ceph-mon[79770]: Activating manager daemon compute-0.qhdjwa
Dec 06 09:43:23 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:23 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feafc001090 fd 14 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:43:23 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 2025-12-06T09:43:23.893+0000 7f263b68c140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Dec 06 09:43:23 compute-1 ceph-mgr[80080]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Dec 06 09:43:23 compute-1 ceph-mgr[80080]: [dashboard DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 06 09:43:23 compute-1 ceph-mgr[80080]: mgr load Constructed class from module: dashboard
Dec 06 09:43:23 compute-1 ceph-mgr[80080]: ms_deliver_dispatch: unhandled message 0x557a03a9d860 mon_map magic: 0 from mon.2 v2:192.168.122.101:3300/0
Dec 06 09:43:23 compute-1 ceph-mgr[80080]: [prometheus DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 06 09:43:23 compute-1 ceph-mgr[80080]: mgr load Constructed class from module: prometheus
Dec 06 09:43:23 compute-1 ceph-mgr[80080]: [dashboard INFO root] server: ssl=no host=192.168.122.101 port=8443
Dec 06 09:43:23 compute-1 ceph-mgr[80080]: [dashboard INFO root] Configured CherryPy, starting engine...
Dec 06 09:43:23 compute-1 ceph-mgr[80080]: [dashboard INFO root] Starting engine...
Dec 06 09:43:23 compute-1 ceph-mgr[80080]: [prometheus INFO root] server_addr: :: server_port: 9283
Dec 06 09:43:23 compute-1 ceph-mgr[80080]: [prometheus INFO root] Starting engine...
Dec 06 09:43:23 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: [06/Dec/2025:09:43:23] ENGINE Bus STARTING
Dec 06 09:43:23 compute-1 ceph-mgr[80080]: [prometheus INFO cherrypy.error] [06/Dec/2025:09:43:23] ENGINE Bus STARTING
Dec 06 09:43:23 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: CherryPy Checker:
Dec 06 09:43:23 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: The Application mounted at '' has an empty config.
Dec 06 09:43:23 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: 
Dec 06 09:43:23 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:43:23 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:43:23 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:23.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:43:24 compute-1 ceph-mgr[80080]: [dashboard INFO root] Engine started...
Dec 06 09:43:24 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: [06/Dec/2025:09:43:24] ENGINE Serving on http://:::9283
Dec 06 09:43:24 compute-1 ceph-mgr[80080]: [prometheus INFO cherrypy.error] [06/Dec/2025:09:43:24] ENGINE Serving on http://:::9283
Dec 06 09:43:24 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-mgr-compute-1-sauzid[80076]: [06/Dec/2025:09:43:24] ENGINE Bus STARTED
Dec 06 09:43:24 compute-1 ceph-mgr[80080]: [prometheus INFO cherrypy.error] [06/Dec/2025:09:43:24] ENGINE Bus STARTED
Dec 06 09:43:24 compute-1 ceph-mgr[80080]: [prometheus INFO root] Engine started.
Dec 06 09:43:24 compute-1 sshd-session[85840]: Accepted publickey for ceph-admin from 192.168.122.100 port 43742 ssh2: RSA SHA256:Gxeh0g0CuyN5zOpDUv+8o0JynyC1ASnaMny1857KGxo
Dec 06 09:43:24 compute-1 systemd-logind[788]: New session 37 of user ceph-admin.
Dec 06 09:43:24 compute-1 systemd[1]: Started Session 37 of User ceph-admin.
Dec 06 09:43:24 compute-1 sshd-session[85840]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Dec 06 09:43:24 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 11.2 scrub starts
Dec 06 09:43:24 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 11.2 scrub ok
Dec 06 09:43:24 compute-1 sudo[85845]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:43:24 compute-1 sudo[85845]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:43:24 compute-1 sudo[85845]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:24 compute-1 sudo[85870]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Dec 06 09:43:24 compute-1 sudo[85870]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:43:24 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:24 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb00004050 fd 14 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:43:25 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:25 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb18001080 fd 14 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:43:25 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 11.18 scrub starts
Dec 06 09:43:25 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 11.18 scrub ok
Dec 06 09:43:25 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:43:25 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:43:25 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:25.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:43:25 compute-1 podman[85965]: 2025-12-06 09:43:25.81934101 +0000 UTC m=+0.091981698 container exec 500f8c89b5c281d0ace139835b07ea5bc6259ce3e028d8284d57da97424b55e0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-crash-compute-1, org.label-schema.schema-version=1.0, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 06 09:43:25 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:25 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb2400a640 fd 14 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:43:25 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:43:25 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:43:25 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:25.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:43:25 compute-1 podman[85965]: 2025-12-06 09:43:25.959735623 +0000 UTC m=+0.232376311 container exec_died 500f8c89b5c281d0ace139835b07ea5bc6259ce3e028d8284d57da97424b55e0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-crash-compute-1, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 06 09:43:26 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 11.1f scrub starts
Dec 06 09:43:26 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 11.1f scrub ok
Dec 06 09:43:26 compute-1 podman[86086]: 2025-12-06 09:43:26.456207324 +0000 UTC m=+0.066551439 container exec 6af22af7046e22bedbb2fb280e4d2c530c5b3cac3959f396bf7fe3d14752a7eb (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 06 09:43:26 compute-1 podman[86086]: 2025-12-06 09:43:26.468731824 +0000 UTC m=+0.079075949 container exec_died 6af22af7046e22bedbb2fb280e4d2c530c5b3cac3959f396bf7fe3d14752a7eb (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 06 09:43:26 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:26 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feafc002a80 fd 14 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:43:26 compute-1 podman[86176]: 2025-12-06 09:43:26.818912761 +0000 UTC m=+0.063045780 container exec 2b1801986393e8e2cbe7b4cdadc22f24012f42b9768a29cb7ee64c55eabe33b4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, org.label-schema.schema-version=1.0, CEPH_REF=squid, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 06 09:43:26 compute-1 podman[86176]: 2025-12-06 09:43:26.833968275 +0000 UTC m=+0.078101294 container exec_died 2b1801986393e8e2cbe7b4cdadc22f24012f42b9768a29cb7ee64c55eabe33b4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325)
Dec 06 09:43:27 compute-1 podman[86243]: 2025-12-06 09:43:27.054816752 +0000 UTC m=+0.051950127 container exec 70891cd2190622057f9c45299e27938f7b2105f0244eda3658dedfb18fed50f0 (image=quay.io/ceph/haproxy:2.3, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd)
Dec 06 09:43:27 compute-1 podman[86243]: 2025-12-06 09:43:27.065588226 +0000 UTC m=+0.062721581 container exec_died 70891cd2190622057f9c45299e27938f7b2105f0244eda3658dedfb18fed50f0 (image=quay.io/ceph/haproxy:2.3, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd)
Dec 06 09:43:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:27 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb00004050 fd 14 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:43:27 compute-1 podman[86309]: 2025-12-06 09:43:27.30043569 +0000 UTC m=+0.053521297 container exec c8ec7212805c01399bc295ce2c5e69b11fbde393e887859b5ab336e81cd6d1f1 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-1-uzbtlt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, summary=Provides keepalived on RHEL 9 for Ceph., vendor=Red Hat, Inc., name=keepalived, version=2.2.4, com.redhat.component=keepalived-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1793, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, description=keepalived for Ceph, io.buildah.version=1.28.2, io.k8s.display-name=Keepalived on RHEL 9, architecture=x86_64, distribution-scope=public, io.openshift.tags=Ceph keepalived, build-date=2023-02-22T09:23:20, vcs-type=git)
Dec 06 09:43:27 compute-1 podman[86309]: 2025-12-06 09:43:27.318580353 +0000 UTC m=+0.071665930 container exec_died c8ec7212805c01399bc295ce2c5e69b11fbde393e887859b5ab336e81cd6d1f1 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-1-uzbtlt, io.buildah.version=1.28.2, io.k8s.display-name=Keepalived on RHEL 9, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=keepalived, version=2.2.4, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=Ceph keepalived, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., build-date=2023-02-22T09:23:20, com.redhat.component=keepalived-container, vcs-type=git, architecture=x86_64, release=1793, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.openshift.expose-services=, summary=Provides keepalived on RHEL 9 for Ceph., description=keepalived for Ceph)
Dec 06 09:43:27 compute-1 sudo[85870]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:27 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 11.10 deep-scrub starts
Dec 06 09:43:27 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 11.10 deep-scrub ok
Dec 06 09:43:27 compute-1 ceph-mon[79770]: 12.9 scrub starts
Dec 06 09:43:27 compute-1 ceph-mon[79770]: 12.9 scrub ok
Dec 06 09:43:27 compute-1 ceph-mon[79770]: 11.5 deep-scrub starts
Dec 06 09:43:27 compute-1 ceph-mon[79770]: 11.5 deep-scrub ok
Dec 06 09:43:27 compute-1 ceph-mon[79770]: 11.d scrub starts
Dec 06 09:43:27 compute-1 ceph-mon[79770]: 11.d scrub ok
Dec 06 09:43:27 compute-1 ceph-mon[79770]: osdmap e91: 3 total, 3 up, 3 in
Dec 06 09:43:27 compute-1 ceph-mon[79770]: mgrmap e31: compute-0.qhdjwa(active, starting, since 0.658318s), standbys: compute-1.sauzid, compute-2.oazbvn
Dec 06 09:43:27 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "mon metadata", "id": "compute-0"}]: dispatch
Dec 06 09:43:27 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Dec 06 09:43:27 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Dec 06 09:43:27 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "mds metadata", "who": "cephfs.compute-0.ujokui"}]: dispatch
Dec 06 09:43:27 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "mds metadata", "who": "cephfs.compute-1.fpvjgb"}]: dispatch
Dec 06 09:43:27 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "mds metadata", "who": "cephfs.compute-2.czucwy"}]: dispatch
Dec 06 09:43:27 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "mgr metadata", "who": "compute-0.qhdjwa", "id": "compute-0.qhdjwa"}]: dispatch
Dec 06 09:43:27 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "mgr metadata", "who": "compute-1.sauzid", "id": "compute-1.sauzid"}]: dispatch
Dec 06 09:43:27 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "mgr metadata", "who": "compute-2.oazbvn", "id": "compute-2.oazbvn"}]: dispatch
Dec 06 09:43:27 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Dec 06 09:43:27 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Dec 06 09:43:27 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Dec 06 09:43:27 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "mds metadata"}]: dispatch
Dec 06 09:43:27 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd metadata"}]: dispatch
Dec 06 09:43:27 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "mon metadata"}]: dispatch
Dec 06 09:43:27 compute-1 ceph-mon[79770]: Manager daemon compute-0.qhdjwa is now available
Dec 06 09:43:27 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:43:27 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 09:43:27 compute-1 ceph-mon[79770]: Standby manager daemon compute-1.sauzid restarted
Dec 06 09:43:27 compute-1 ceph-mon[79770]: Standby manager daemon compute-1.sauzid started
Dec 06 09:43:27 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.qhdjwa/mirror_snapshot_schedule"}]: dispatch
Dec 06 09:43:27 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.qhdjwa/trash_purge_schedule"}]: dispatch
Dec 06 09:43:27 compute-1 sudo[86342]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:43:27 compute-1 sudo[86342]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:43:27 compute-1 sudo[86342]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:27 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:43:27 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:43:27 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:27.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:43:27 compute-1 sudo[86367]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 06 09:43:27 compute-1 sudo[86367]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:43:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:27 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb18002470 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:43:27 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 09:43:27 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:43:27 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:43:27 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:27.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
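The anonymous "HEAD / HTTP/1.0" 200 requests from 192.168.122.100 and 192.168.122.102 recur roughly every two seconds and carry zero bytes, which is the signature of load-balancer health probes against the RGW beast frontend rather than real S3 traffic. A minimal sketch of such a probe, assuming the gateway listens on port 8080 (the port is not visible in these lines, so it is a placeholder):

    import http.client

    # Same anonymous probe the beast access log records: HEAD /
    # expecting 200 with an empty body. Note http.client speaks
    # HTTP/1.1, while the probes in the log use HTTP/1.0; the
    # health semantics are unchanged.
    conn = http.client.HTTPConnection("192.168.122.102", 8080, timeout=5)
    conn.request("HEAD", "/")
    resp = conn.getresponse()
    print(resp.status)  # 200 while the gateway is healthy
    conn.close()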
Dec 06 09:43:28 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 11.11 scrub starts
Dec 06 09:43:28 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 11.11 scrub ok
Dec 06 09:43:28 compute-1 sudo[86367]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:28 compute-1 sudo[86424]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:43:28 compute-1 sudo[86424]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:43:28 compute-1 sudo[86424]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:28 compute-1 sudo[86449]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 list-networks
Dec 06 09:43:28 compute-1 sudo[86449]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:43:28 compute-1 ceph-mon[79770]: 12.7 deep-scrub starts
Dec 06 09:43:28 compute-1 ceph-mon[79770]: 12.7 deep-scrub ok
Dec 06 09:43:28 compute-1 ceph-mon[79770]: 11.4 scrub starts
Dec 06 09:43:28 compute-1 ceph-mon[79770]: 11.4 scrub ok
Dec 06 09:43:28 compute-1 ceph-mon[79770]: 11.2 scrub starts
Dec 06 09:43:28 compute-1 ceph-mon[79770]: 11.2 scrub ok
Dec 06 09:43:28 compute-1 ceph-mon[79770]: 12.11 scrub starts
Dec 06 09:43:28 compute-1 ceph-mon[79770]: 12.11 scrub ok
Dec 06 09:43:28 compute-1 ceph-mon[79770]: 11.f scrub starts
Dec 06 09:43:28 compute-1 ceph-mon[79770]: 11.f scrub ok
Dec 06 09:43:28 compute-1 ceph-mon[79770]: [06/Dec/2025:09:43:25] ENGINE Bus STARTING
Dec 06 09:43:28 compute-1 ceph-mon[79770]: 11.18 scrub starts
Dec 06 09:43:28 compute-1 ceph-mon[79770]: 11.18 scrub ok
Dec 06 09:43:28 compute-1 ceph-mon[79770]: [06/Dec/2025:09:43:25] ENGINE Serving on https://192.168.122.100:7150
Dec 06 09:43:28 compute-1 ceph-mon[79770]: [06/Dec/2025:09:43:25] ENGINE Client ('192.168.122.100', 44988) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Dec 06 09:43:28 compute-1 ceph-mon[79770]: [06/Dec/2025:09:43:25] ENGINE Serving on http://192.168.122.100:8765
Dec 06 09:43:28 compute-1 ceph-mon[79770]: [06/Dec/2025:09:43:25] ENGINE Bus STARTED
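The ENGINE lines are the newly active mgr's CherryPy servers coming up (the TLS frontend on https://192.168.122.100:7150 and a plain-HTTP endpoint on :8765), relayed through the mon's cluster log. The "Client ... lost" EOF during handshake is what the server side sees when a peer opens the TCP connection and closes it without ever starting TLS, which is typical of a connect-and-close liveness probe. A minimal sketch of such a probe, which would produce exactly that EOF entry:

    import socket

    # TCP-level liveness probe: connect, then close without starting
    # a TLS handshake. The server logs EOF during handshake, as in
    # the ENGINE "Client ... lost" line above.
    with socket.create_connection(("192.168.122.100", 7150), timeout=5):
        pass
    print("port is accepting connections")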
Dec 06 09:43:28 compute-1 ceph-mon[79770]: 12.1a deep-scrub starts
Dec 06 09:43:28 compute-1 ceph-mon[79770]: 11.1d scrub starts
Dec 06 09:43:28 compute-1 ceph-mon[79770]: 11.1f scrub starts
Dec 06 09:43:28 compute-1 ceph-mon[79770]: 11.1f scrub ok
Dec 06 09:43:28 compute-1 ceph-mon[79770]: 12.1a deep-scrub ok
Dec 06 09:43:28 compute-1 ceph-mon[79770]: 12.2 scrub starts
Dec 06 09:43:28 compute-1 ceph-mon[79770]: 11.10 deep-scrub starts
Dec 06 09:43:28 compute-1 ceph-mon[79770]: 11.10 deep-scrub ok
Dec 06 09:43:28 compute-1 ceph-mon[79770]: 11.1d scrub ok
Dec 06 09:43:28 compute-1 ceph-mon[79770]: 12.2 scrub ok
Dec 06 09:43:28 compute-1 ceph-mon[79770]: 11.12 scrub starts
Dec 06 09:43:28 compute-1 ceph-mon[79770]: 11.12 scrub ok
Dec 06 09:43:28 compute-1 ceph-mon[79770]: mgrmap e32: compute-0.qhdjwa(active, since 4s), standbys: compute-1.sauzid, compute-2.oazbvn
Dec 06 09:43:28 compute-1 ceph-mon[79770]: pgmap v3: 337 pgs: 337 active+clean; 458 KiB data, 143 MiB used, 60 GiB / 60 GiB avail
Dec 06 09:43:28 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:43:28 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:43:28 compute-1 ceph-mon[79770]: pgmap v4: 337 pgs: 337 active+clean; 458 KiB data, 143 MiB used, 60 GiB / 60 GiB avail
Dec 06 09:43:28 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "12"}]: dispatch
Dec 06 09:43:28 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "11"}]: dispatch
Dec 06 09:43:28 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e92 e92: 3 total, 3 up, 3 in
Dec 06 09:43:28 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 92 pg[10.a( v 51'1027 (0'0,51'1027] local-lis/les=68/69 n=9 ec=58/45 lis/c=68/68 les/c/f=69/69/0 sis=92 pruub=14.626732826s) [1] r=-1 lpr=92 pi=[68,92)/1 crt=51'1027 mlcod 0'0 active pruub 247.411544800s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:43:28 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 92 pg[10.a( v 51'1027 (0'0,51'1027] local-lis/les=68/69 n=9 ec=58/45 lis/c=68/68 les/c/f=69/69/0 sis=92 pruub=14.626684189s) [1] r=-1 lpr=92 pi=[68,92)/1 crt=51'1027 mlcod 0'0 unknown NOTIFY pruub 247.411544800s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:43:28 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 92 pg[10.1a( v 51'1027 (0'0,51'1027] local-lis/les=67/68 n=4 ec=58/45 lis/c=67/67 les/c/f=68/68/0 sis=92 pruub=13.612200737s) [1] r=-1 lpr=92 pi=[67,92)/1 crt=51'1027 mlcod 0'0 active pruub 246.398071289s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:43:28 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 92 pg[10.1a( v 51'1027 (0'0,51'1027] local-lis/les=67/68 n=4 ec=58/45 lis/c=67/67 les/c/f=68/68/0 sis=92 pruub=13.612160683s) [1] r=-1 lpr=92 pi=[67,92)/1 crt=51'1027 mlcod 0'0 unknown NOTIFY pruub 246.398071289s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:43:28 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 92 pg[6.b( empty local-lis/les=0/0 n=0 ec=54/21 lis/c=64/64 les/c/f=65/65/0 sis=92) [0] r=0 lpr=92 pi=[64,92)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:43:28 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:28 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb2400a640 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:43:28 compute-1 sudo[86449]: pam_unix(sudo:session): session closed for user root
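Between the sudo session markers above, the orchestrator's ceph-admin user runs the content-addressed copy of the cephadm binary twice: once as gather-facts and once as list-networks with an explicit image digest. A minimal sketch repeating the same two invocations, with every path, flag, and digest taken verbatim from the log entries:

    import subprocess

    cephadm = ("/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/"
               "cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36")
    image = ("quay.io/ceph/ceph@sha256:"
             "7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec")

    # The two calls the mgr issues via sudo in the entries above.
    facts = subprocess.check_output(
        ["sudo", "/bin/python3", cephadm, "--timeout", "895", "gather-facts"])
    nets = subprocess.check_output(
        ["sudo", "/bin/python3", cephadm, "--image", image,
         "--timeout", "895", "list-networks"])
    print(len(facts), len(nets))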
Dec 06 09:43:29 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:29 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb2400a640 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:43:29 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 11.6 scrub starts
Dec 06 09:43:29 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 11.6 scrub ok
Dec 06 09:43:29 compute-1 ceph-mon[79770]: 11.8 scrub starts
Dec 06 09:43:29 compute-1 ceph-mon[79770]: 11.8 scrub ok
Dec 06 09:43:29 compute-1 ceph-mon[79770]: 11.11 scrub starts
Dec 06 09:43:29 compute-1 ceph-mon[79770]: 11.11 scrub ok
Dec 06 09:43:29 compute-1 ceph-mon[79770]: 11.1 scrub starts
Dec 06 09:43:29 compute-1 ceph-mon[79770]: 11.1 scrub ok
Dec 06 09:43:29 compute-1 ceph-mon[79770]: mgrmap e33: compute-0.qhdjwa(active, since 5s), standbys: compute-1.sauzid, compute-2.oazbvn
Dec 06 09:43:29 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "12"}]': finished
Dec 06 09:43:29 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "11"}]': finished
Dec 06 09:43:29 compute-1 ceph-mon[79770]: osdmap e92: 3 total, 3 up, 3 in
Dec 06 09:43:29 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:43:29 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:43:29 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Dec 06 09:43:29 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]': finished
Dec 06 09:43:29 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:43:29 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:43:29 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e93 e93: 3 total, 3 up, 3 in
Dec 06 09:43:29 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 93 pg[10.a( v 51'1027 (0'0,51'1027] local-lis/les=68/69 n=9 ec=58/45 lis/c=68/68 les/c/f=69/69/0 sis=93) [1]/[0] r=0 lpr=93 pi=[68,93)/1 crt=51'1027 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:43:29 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 93 pg[10.a( v 51'1027 (0'0,51'1027] local-lis/les=68/69 n=9 ec=58/45 lis/c=68/68 les/c/f=69/69/0 sis=93) [1]/[0] r=0 lpr=93 pi=[68,93)/1 crt=51'1027 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Dec 06 09:43:29 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 93 pg[10.1a( v 51'1027 (0'0,51'1027] local-lis/les=67/68 n=4 ec=58/45 lis/c=67/67 les/c/f=68/68/0 sis=93) [1]/[0] r=0 lpr=93 pi=[67,93)/1 crt=51'1027 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:43:29 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 93 pg[10.1a( v 51'1027 (0'0,51'1027] local-lis/les=67/68 n=4 ec=58/45 lis/c=67/67 les/c/f=68/68/0 sis=93) [1]/[0] r=0 lpr=93 pi=[67,93)/1 crt=51'1027 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Dec 06 09:43:29 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 93 pg[6.b( v 50'39 lc 0'0 (0'0,50'39] local-lis/les=92/93 n=1 ec=54/21 lis/c=64/64 les/c/f=65/65/0 sis=92) [0] r=0 lpr=92 pi=[64,92)/1 crt=50'39 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
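In the osd.0 peering lines above, "up" is the CRUSH-chosen OSD set and "acting" is the set actually serving I/O; a notation like [1]/[0] prints up/acting when they differ, i.e. a remapped PG, and "transitioning to Stray" means this OSD left the acting set for that PG. A minimal sketch, assuming an admin keyring is available, that lists PGs whose up and acting sets currently differ (the JSON layout varies across Ceph releases, so both common shapes are tolerated):

    import json
    import subprocess

    # "ceph pg dump pgs_brief" reports up/acting per PG; a mismatch
    # marks a remapped PG like 10.a and 10.1a in the entries above.
    raw = subprocess.check_output(
        ["ceph", "pg", "dump", "pgs_brief", "--format", "json"])
    data = json.loads(raw)
    stats = data.get("pg_stats") or data.get("pg_map", {}).get("pg_stats", [])
    for pg in stats:
        if pg["up"] != pg["acting"]:
            print(pg["pgid"], "up", pg["up"], "acting", pg["acting"])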
Dec 06 09:43:29 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:43:29 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:43:29 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:29.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:43:29 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:29 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb00004050 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:43:29 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:43:29 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:43:29 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:29.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:43:30 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 11.15 scrub starts
Dec 06 09:43:30 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 11.15 scrub ok
Dec 06 09:43:30 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:30 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb200045e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:43:30 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e94 e94: 3 total, 3 up, 3 in
Dec 06 09:43:30 compute-1 ceph-mon[79770]: 11.19 scrub starts
Dec 06 09:43:30 compute-1 ceph-mon[79770]: 11.19 scrub ok
Dec 06 09:43:30 compute-1 ceph-mon[79770]: 11.6 scrub starts
Dec 06 09:43:30 compute-1 ceph-mon[79770]: 11.6 scrub ok
Dec 06 09:43:30 compute-1 ceph-mon[79770]: 11.1b scrub starts
Dec 06 09:43:30 compute-1 ceph-mon[79770]: 11.1b scrub ok
Dec 06 09:43:30 compute-1 ceph-mon[79770]: osdmap e93: 3 total, 3 up, 3 in
Dec 06 09:43:30 compute-1 ceph-mon[79770]: pgmap v7: 337 pgs: 337 active+clean; 458 KiB data, 143 MiB used, 60 GiB / 60 GiB avail
Dec 06 09:43:30 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "13"}]: dispatch
Dec 06 09:43:30 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "12"}]: dispatch
Dec 06 09:43:30 compute-1 ceph-mon[79770]: mgrmap e34: compute-0.qhdjwa(active, since 6s), standbys: compute-1.sauzid, compute-2.oazbvn
Dec 06 09:43:30 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:43:30 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:43:30 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Dec 06 09:43:30 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:43:30 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:43:30 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 94 pg[10.a( v 51'1027 (0'0,51'1027] local-lis/les=93/94 n=9 ec=58/45 lis/c=68/68 les/c/f=69/69/0 sis=93) [1]/[0] async=[1] r=0 lpr=93 pi=[68,93)/1 crt=51'1027 mlcod 0'0 active+remapped mbc={255={(0+1)=9}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:43:30 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 94 pg[10.1a( v 51'1027 (0'0,51'1027] local-lis/les=93/94 n=4 ec=58/45 lis/c=67/67 les/c/f=68/68/0 sis=93) [1]/[0] async=[1] r=0 lpr=93 pi=[67,93)/1 crt=51'1027 mlcod 0'0 active+remapped mbc={255={(0+1)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:43:31 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:31 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb2400a640 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:43:31 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 12.10 scrub starts
Dec 06 09:43:31 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 12.10 scrub ok
Dec 06 09:43:31 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:43:31 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:43:31 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:31.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:43:31 compute-1 ceph-mon[79770]: 12.3 scrub starts
Dec 06 09:43:31 compute-1 ceph-mon[79770]: 12.3 scrub ok
Dec 06 09:43:31 compute-1 ceph-mon[79770]: 11.15 scrub starts
Dec 06 09:43:31 compute-1 ceph-mon[79770]: 11.15 scrub ok
Dec 06 09:43:31 compute-1 ceph-mon[79770]: 11.14 scrub starts
Dec 06 09:43:31 compute-1 ceph-mon[79770]: 11.14 scrub ok
Dec 06 09:43:31 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "13"}]': finished
Dec 06 09:43:31 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "12"}]': finished
Dec 06 09:43:31 compute-1 ceph-mon[79770]: osdmap e94: 3 total, 3 up, 3 in
Dec 06 09:43:31 compute-1 ceph-mon[79770]: 10.17 scrub starts
Dec 06 09:43:31 compute-1 ceph-mon[79770]: 10.17 scrub ok
Dec 06 09:43:31 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "14"}]: dispatch
Dec 06 09:43:31 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "13"}]: dispatch
Dec 06 09:43:31 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e95 e95: 3 total, 3 up, 3 in
Dec 06 09:43:31 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 95 pg[10.a( v 51'1027 (0'0,51'1027] local-lis/les=93/94 n=9 ec=58/45 lis/c=93/68 les/c/f=94/69/0 sis=95 pruub=15.010676384s) [1] async=[1] r=-1 lpr=95 pi=[68,95)/1 crt=51'1027 mlcod 51'1027 active pruub 250.848419189s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:43:31 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 95 pg[10.a( v 51'1027 (0'0,51'1027] local-lis/les=93/94 n=9 ec=58/45 lis/c=93/68 les/c/f=94/69/0 sis=95 pruub=15.010571480s) [1] r=-1 lpr=95 pi=[68,95)/1 crt=51'1027 mlcod 0'0 unknown NOTIFY pruub 250.848419189s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:43:31 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 95 pg[10.1a( v 51'1027 (0'0,51'1027] local-lis/les=93/94 n=4 ec=58/45 lis/c=93/67 les/c/f=94/68/0 sis=95 pruub=15.009842873s) [1] async=[1] r=-1 lpr=95 pi=[67,95)/1 crt=51'1027 mlcod 51'1027 active pruub 250.848556519s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:43:31 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 95 pg[10.1a( v 51'1027 (0'0,51'1027] local-lis/les=93/94 n=4 ec=58/45 lis/c=93/67 les/c/f=94/68/0 sis=95 pruub=15.009781837s) [1] r=-1 lpr=95 pi=[67,95)/1 crt=51'1027 mlcod 0'0 unknown NOTIFY pruub 250.848556519s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:43:31 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:31 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb2400a640 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:43:31 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:43:31 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:43:31 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:31.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:43:32 compute-1 sudo[86493]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 06 09:43:32 compute-1 sudo[86493]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:43:32 compute-1 sudo[86493]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:32 compute-1 sudo[86518]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/etc/ceph
Dec 06 09:43:32 compute-1 sudo[86518]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:43:32 compute-1 sudo[86518]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:32 compute-1 sudo[86543]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/etc/ceph/ceph.conf.new
Dec 06 09:43:32 compute-1 sudo[86543]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:43:32 compute-1 sudo[86543]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:32 compute-1 sudo[86568]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258
Dec 06 09:43:32 compute-1 sudo[86568]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:43:32 compute-1 sudo[86568]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:32 compute-1 sudo[86594]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/etc/ceph/ceph.conf.new
Dec 06 09:43:32 compute-1 sudo[86594]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:43:32 compute-1 sudo[86594]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:32 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 12.c scrub starts
Dec 06 09:43:32 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 12.c scrub ok
Dec 06 09:43:32 compute-1 sudo[86642]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/etc/ceph/ceph.conf.new
Dec 06 09:43:32 compute-1 sudo[86642]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:43:32 compute-1 sudo[86642]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:32 compute-1 sudo[86667]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/etc/ceph/ceph.conf.new
Dec 06 09:43:32 compute-1 sudo[86667]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:43:32 compute-1 sudo[86667]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:32 compute-1 sudo[86692]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Dec 06 09:43:32 compute-1 sudo[86692]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:43:32 compute-1 sudo[86692]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:32 compute-1 sudo[86717]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config
Dec 06 09:43:32 compute-1 sudo[86717]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:43:32 compute-1 sudo[86717]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:32 compute-1 sudo[86742]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config
Dec 06 09:43:32 compute-1 sudo[86742]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:43:32 compute-1 sudo[86742]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:32 compute-1 sudo[86767]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.conf.new
Dec 06 09:43:32 compute-1 sudo[86767]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:43:32 compute-1 sudo[86767]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:32 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb00004050 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:43:32 compute-1 sudo[86792]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258
Dec 06 09:43:32 compute-1 sudo[86792]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:43:32 compute-1 sudo[86792]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:32 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e95 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 09:43:32 compute-1 sudo[86817]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.conf.new
Dec 06 09:43:32 compute-1 sudo[86817]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:43:32 compute-1 sudo[86817]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:33 compute-1 sudo[86865]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.conf.new
Dec 06 09:43:33 compute-1 sudo[86865]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:43:33 compute-1 sudo[86865]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:33 compute-1 sudo[86890]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.conf.new
Dec 06 09:43:33 compute-1 sudo[86890]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:43:33 compute-1 sudo[86890]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:33 compute-1 sudo[86915]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.conf.new /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.conf
Dec 06 09:43:33 compute-1 sudo[86915]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:43:33 compute-1 sudo[86915]: pam_unix(sudo:session): session closed for user root
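The sudo sequence above is cephadm's staged-write discipline for distributing ceph.conf: create the file as ceph.conf.new in a scratch tree under /tmp, fix ownership (0:0) and mode (644), then mv it to the final path, so a partially written config never sits at the destination under its real name. Since a mv from /tmp can cross filesystems and degrade to copy-plus-unlink, the sketch below stages next to the destination instead and uses os.replace for a genuinely atomic swap; it is a minimal illustration of the idea, not cephadm's own code:

    import os
    import tempfile

    def atomic_write(path, data, mode=0o644, uid=0, gid=0):
        # Stage in the destination directory so os.replace stays on
        # one filesystem, mirroring the touch/chown/chmod/mv steps
        # logged above.
        fd, tmp = tempfile.mkstemp(dir=os.path.dirname(path), suffix=".new")
        try:
            os.write(fd, data)
            os.fchmod(fd, mode)
            os.fchown(fd, uid, gid)   # needs root, like the sudo chown
        finally:
            os.close(fd)
        os.replace(tmp, path)         # the final rename step

    atomic_write("/etc/ceph/ceph.conf",
                 b"[global]\nfsid = 5ecd3f74-dade-5fc4-92ce-8950ae424258\n")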
Dec 06 09:43:33 compute-1 sudo[86940]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Dec 06 09:43:33 compute-1 sudo[86940]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:43:33 compute-1 sudo[86940]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:33 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:33 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb200045e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:43:33 compute-1 sudo[86965]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/etc/ceph
Dec 06 09:43:33 compute-1 sudo[86965]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:43:33 compute-1 sudo[86965]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:33 compute-1 sudo[86990]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/etc/ceph/ceph.client.admin.keyring.new
Dec 06 09:43:33 compute-1 sudo[86990]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:43:33 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 12.a scrub starts
Dec 06 09:43:33 compute-1 sudo[86990]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:33 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 12.a scrub ok
Dec 06 09:43:33 compute-1 sudo[87015]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258
Dec 06 09:43:33 compute-1 sudo[87015]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:43:33 compute-1 sudo[87015]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:33 compute-1 sudo[87040]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/etc/ceph/ceph.client.admin.keyring.new
Dec 06 09:43:33 compute-1 sudo[87040]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:43:33 compute-1 sudo[87040]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:33 compute-1 sudo[87088]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/etc/ceph/ceph.client.admin.keyring.new
Dec 06 09:43:33 compute-1 sudo[87088]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:43:33 compute-1 sudo[87088]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:33 compute-1 sudo[87113]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/etc/ceph/ceph.client.admin.keyring.new
Dec 06 09:43:33 compute-1 sudo[87113]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:43:33 compute-1 sudo[87113]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:33 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:43:33 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:43:33 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:33.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:43:33 compute-1 sudo[87138]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Dec 06 09:43:33 compute-1 sudo[87138]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:43:33 compute-1 sudo[87138]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:33 compute-1 sudo[87163]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config
Dec 06 09:43:33 compute-1 sudo[87163]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:43:33 compute-1 sudo[87163]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:33 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:33 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb200045e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:43:33 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:43:33 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:43:33 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:33.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:43:33 compute-1 sudo[87188]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config
Dec 06 09:43:33 compute-1 sudo[87188]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:43:33 compute-1 sudo[87188]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:34 compute-1 sudo[87213]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.client.admin.keyring.new
Dec 06 09:43:34 compute-1 sudo[87213]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:43:34 compute-1 sudo[87213]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:34 compute-1 sudo[87238]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258
Dec 06 09:43:34 compute-1 sudo[87238]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:43:34 compute-1 sudo[87238]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:34 compute-1 sudo[87263]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.client.admin.keyring.new
Dec 06 09:43:34 compute-1 sudo[87263]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:43:34 compute-1 sudo[87263]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:34 compute-1 sudo[87312]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.client.admin.keyring.new
Dec 06 09:43:34 compute-1 sudo[87312]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:43:34 compute-1 sudo[87312]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:34 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 12.b scrub starts
Dec 06 09:43:34 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 12.b scrub ok
Dec 06 09:43:34 compute-1 sudo[87337]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.client.admin.keyring.new
Dec 06 09:43:34 compute-1 sudo[87337]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:43:34 compute-1 sudo[87337]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:34 compute-1 sudo[87362]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-5ecd3f74-dade-5fc4-92ce-8950ae424258/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.client.admin.keyring.new /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.client.admin.keyring
Dec 06 09:43:34 compute-1 sudo[87362]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:43:34 compute-1 sudo[87362]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:34 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb200045e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:43:35 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e96 e96: 3 total, 3 up, 3 in
Dec 06 09:43:35 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:35 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb00004050 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:43:35 compute-1 ceph-mon[79770]: 12.1d scrub starts
Dec 06 09:43:35 compute-1 ceph-mon[79770]: 12.1d scrub ok
Dec 06 09:43:35 compute-1 ceph-mon[79770]: 12.10 scrub starts
Dec 06 09:43:35 compute-1 ceph-mon[79770]: 12.10 scrub ok
Dec 06 09:43:35 compute-1 ceph-mon[79770]: pgmap v9: 337 pgs: 337 active+clean; 458 KiB data, 143 MiB used, 60 GiB / 60 GiB avail
Dec 06 09:43:35 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "14"}]': finished
Dec 06 09:43:35 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "13"}]': finished
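The repeated "osd pool set ... pgp_num_actual" dispatch/finished pairs show the mgr walking pgp_num up one step per round (12 to 13 to 14 for cephfs.cephfs.meta, 11 to 12 to 13 for default.rgw.log) rather than jumping straight to the target, which bounds how much data movement each osdmap epoch can trigger. A minimal sketch of that incremental loop, with the pool name from the log and a hypothetical target of 16:

    import subprocess

    def step_pgp_num(pool, current, target):
        # One increment per call, mirroring the mgr's
        # dispatch/finished pairs in the audit entries above.
        for val in range(current + 1, target + 1):
            subprocess.check_call(
                ["ceph", "osd", "pool", "set", pool,
                 "pgp_num_actual", str(val)])

    step_pgp_num("cephfs.cephfs.meta", 14, 16)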
Dec 06 09:43:35 compute-1 ceph-mon[79770]: osdmap e95: 3 total, 3 up, 3 in
Dec 06 09:43:35 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:43:35 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:43:35 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Dec 06 09:43:35 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 09:43:35 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 06 09:43:35 compute-1 ceph-mon[79770]: Updating compute-0:/etc/ceph/ceph.conf
Dec 06 09:43:35 compute-1 ceph-mon[79770]: Updating compute-1:/etc/ceph/ceph.conf
Dec 06 09:43:35 compute-1 ceph-mon[79770]: Updating compute-2:/etc/ceph/ceph.conf
Dec 06 09:43:35 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 12.e scrub starts
Dec 06 09:43:35 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 12.e scrub ok
Dec 06 09:43:35 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:43:35 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:43:35 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:35.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:43:35 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:35 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb2400a640 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:43:35 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:43:35 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:43:35 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:35.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:43:36 compute-1 sshd-session[87387]: Accepted publickey for zuul from 192.168.122.30 port 38118 ssh2: ECDSA SHA256:r1j7aLsKAM+XxDNbzEU5vWGpGNCOaIBwc7FZdATPttA
Dec 06 09:43:36 compute-1 systemd-logind[788]: New session 38 of user zuul.
Dec 06 09:43:36 compute-1 systemd[1]: Started Session 38 of User zuul.
Dec 06 09:43:36 compute-1 sshd-session[87387]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 06 09:43:36 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 12.19 scrub starts
Dec 06 09:43:36 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 12.19 scrub ok
Dec 06 09:43:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:36 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feafc001f70 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:43:36 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e97 e97: 3 total, 3 up, 3 in
Dec 06 09:43:36 compute-1 ceph-mon[79770]: 12.4 deep-scrub starts
Dec 06 09:43:36 compute-1 ceph-mon[79770]: 12.4 deep-scrub ok
Dec 06 09:43:36 compute-1 ceph-mon[79770]: 12.c scrub starts
Dec 06 09:43:36 compute-1 ceph-mon[79770]: 12.c scrub ok
Dec 06 09:43:36 compute-1 ceph-mon[79770]: Updating compute-2:/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.conf
Dec 06 09:43:36 compute-1 ceph-mon[79770]: Updating compute-1:/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.conf
Dec 06 09:43:36 compute-1 ceph-mon[79770]: Updating compute-0:/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.conf
Dec 06 09:43:36 compute-1 ceph-mon[79770]: 10.0 scrub starts
Dec 06 09:43:36 compute-1 ceph-mon[79770]: 10.0 scrub ok
Dec 06 09:43:36 compute-1 ceph-mon[79770]: 12.1e scrub starts
Dec 06 09:43:36 compute-1 ceph-mon[79770]: 12.1e scrub ok
Dec 06 09:43:36 compute-1 ceph-mon[79770]: Updating compute-2:/etc/ceph/ceph.client.admin.keyring
Dec 06 09:43:36 compute-1 ceph-mon[79770]: Updating compute-1:/etc/ceph/ceph.client.admin.keyring
Dec 06 09:43:36 compute-1 ceph-mon[79770]: Updating compute-0:/etc/ceph/ceph.client.admin.keyring
Dec 06 09:43:36 compute-1 ceph-mon[79770]: 12.a scrub starts
Dec 06 09:43:36 compute-1 ceph-mon[79770]: 12.a scrub ok
Dec 06 09:43:36 compute-1 ceph-mon[79770]: 10.f scrub starts
Dec 06 09:43:36 compute-1 ceph-mon[79770]: Updating compute-2:/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.client.admin.keyring
Dec 06 09:43:36 compute-1 ceph-mon[79770]: pgmap v11: 337 pgs: 2 unknown, 2 remapped+peering, 2 peering, 1 active+clean+scrubbing, 330 active+clean; 458 KiB data, 143 MiB used, 60 GiB / 60 GiB avail; 21 B/s, 0 objects/s recovering
Dec 06 09:43:36 compute-1 ceph-mon[79770]: Updating compute-1:/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.client.admin.keyring
Dec 06 09:43:36 compute-1 ceph-mon[79770]: 12.13 scrub starts
Dec 06 09:43:36 compute-1 ceph-mon[79770]: Updating compute-0:/var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/config/ceph.client.admin.keyring
Dec 06 09:43:36 compute-1 ceph-mon[79770]: 12.13 scrub ok
Dec 06 09:43:36 compute-1 ceph-mon[79770]: 12.b scrub starts
Dec 06 09:43:36 compute-1 ceph-mon[79770]: 12.b scrub ok
Dec 06 09:43:36 compute-1 ceph-mon[79770]: 6.3 deep-scrub starts
Dec 06 09:43:36 compute-1 ceph-mon[79770]: 10.f scrub ok
Dec 06 09:43:36 compute-1 ceph-mon[79770]: 12.18 scrub starts
Dec 06 09:43:36 compute-1 ceph-mon[79770]: 12.18 scrub ok
Dec 06 09:43:36 compute-1 ceph-mon[79770]: 6.3 deep-scrub ok
Dec 06 09:43:36 compute-1 ceph-mon[79770]: osdmap e96: 3 total, 3 up, 3 in
Dec 06 09:43:36 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:43:36 compute-1 ceph-mon[79770]: 12.e scrub starts
Dec 06 09:43:36 compute-1 ceph-mon[79770]: 12.e scrub ok
Dec 06 09:43:36 compute-1 ceph-mon[79770]: pgmap v13: 337 pgs: 2 unknown, 2 remapped+peering, 2 peering, 1 active+clean+scrubbing, 330 active+clean; 458 KiB data, 143 MiB used, 60 GiB / 60 GiB avail; 18 B/s, 0 objects/s recovering
Dec 06 09:43:36 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:43:36 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:43:36 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:43:36 compute-1 python3.9[87541]: ansible-ansible.legacy.ping Invoked with data=pong
Dec 06 09:43:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:37 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb200045e0 fd 47 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:43:37 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 12.8 scrub starts
Dec 06 09:43:37 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 12.8 scrub ok
Dec 06 09:43:37 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:43:37 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:43:37 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:37.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:43:37 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e98 e98: 3 total, 3 up, 3 in
Dec 06 09:43:37 compute-1 ceph-mon[79770]: 12.19 scrub starts
Dec 06 09:43:37 compute-1 ceph-mon[79770]: 12.19 scrub ok
Dec 06 09:43:37 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:43:37 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:43:37 compute-1 ceph-mon[79770]: osdmap e97: 3 total, 3 up, 3 in
Dec 06 09:43:37 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:43:37 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:43:37 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 06 09:43:37 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 06 09:43:37 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 09:43:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[84522]: 06/12/2025 09:43:37 : epoch 6933fa6b : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7feb00004050 fd 47 proxy ignored for local
Dec 06 09:43:37 compute-1 kernel: ganesha.nfsd[84964]: segfault at 50 ip 00007febd392032e sp 00007feb8d7f9210 error 4 in libntirpc.so.5.8[7febd3905000+2c000] likely on CPU 2 (core 0, socket 2)
Dec 06 09:43:37 compute-1 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec 06 09:43:37 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e98 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 09:43:37 compute-1 systemd[1]: Created slice Slice /system/systemd-coredump.
Dec 06 09:43:37 compute-1 systemd[1]: Started Process Core Dump (PID 87688/UID 0).
Dec 06 09:43:37 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:43:37 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:43:37 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:37.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:43:38 compute-1 python3.9[87717]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:43:38 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 12.6 deep-scrub starts
Dec 06 09:43:38 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 12.6 deep-scrub ok
Dec 06 09:43:38 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e99 e99: 3 total, 3 up, 3 in
Dec 06 09:43:38 compute-1 ceph-mon[79770]: 12.8 scrub starts
Dec 06 09:43:38 compute-1 ceph-mon[79770]: 12.8 scrub ok
Dec 06 09:43:38 compute-1 ceph-mon[79770]: pgmap v15: 337 pgs: 1 peering, 3 active+remapped, 333 active+clean; 458 KiB data, 143 MiB used, 60 GiB / 60 GiB avail; 43 B/s, 1 objects/s recovering
Dec 06 09:43:38 compute-1 ceph-mon[79770]: osdmap e98: 3 total, 3 up, 3 in
Dec 06 09:43:38 compute-1 ceph-mon[79770]: 10.1b scrub starts
Dec 06 09:43:38 compute-1 ceph-mon[79770]: 10.1b scrub ok
Dec 06 09:43:38 compute-1 ceph-mon[79770]: osdmap e99: 3 total, 3 up, 3 in
Dec 06 09:43:39 compute-1 sudo[87872]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-csflyrpviolxlwftydtufylhymsfztah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014218.765116-94-217934205838295/AnsiballZ_command.py'
Dec 06 09:43:39 compute-1 sudo[87872]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:39 compute-1 python3.9[87874]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:43:39 compute-1 sudo[87872]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:39 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 12.12 scrub starts
Dec 06 09:43:39 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 12.12 scrub ok
Dec 06 09:43:39 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:43:39 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:43:39 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:39.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:43:39 compute-1 ceph-mon[79770]: 12.6 deep-scrub starts
Dec 06 09:43:39 compute-1 ceph-mon[79770]: 12.6 deep-scrub ok
Dec 06 09:43:39 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 09:43:39 compute-1 ceph-mon[79770]: 10.b scrub starts
Dec 06 09:43:39 compute-1 ceph-mon[79770]: 10.b scrub ok
Dec 06 09:43:39 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:43:39 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:43:39 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:39.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:43:40 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 12.1c scrub starts
Dec 06 09:43:40 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 12.1c scrub ok
Dec 06 09:43:40 compute-1 sudo[88026]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wyuikbrpfyjmpkvylisbebvdjwenfyki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014219.9041564-130-278622694503570/AnsiballZ_stat.py'
Dec 06 09:43:40 compute-1 sudo[88026]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:40 compute-1 systemd-coredump[87690]: Process 84526 (ganesha.nfsd) of user 0 dumped core.
                                                   
                                                   Stack trace of thread 51:
                                                   #0  0x00007febd392032e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                   ELF object binary architecture: AMD x86-64
Dec 06 09:43:40 compute-1 python3.9[88028]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:43:40 compute-1 sudo[88026]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:40 compute-1 systemd[1]: systemd-coredump@0-87688-0.service: Deactivated successfully.
Dec 06 09:43:40 compute-1 systemd[1]: systemd-coredump@0-87688-0.service: Consumed 2.890s CPU time.
Dec 06 09:43:40 compute-1 ceph-mon[79770]: 12.12 scrub starts
Dec 06 09:43:40 compute-1 ceph-mon[79770]: 12.12 scrub ok
Dec 06 09:43:40 compute-1 ceph-mon[79770]: pgmap v18: 337 pgs: 1 peering, 3 active+remapped, 333 active+clean; 458 KiB data, 143 MiB used, 60 GiB / 60 GiB avail; 34 B/s, 2 objects/s recovering
Dec 06 09:43:40 compute-1 ceph-mon[79770]: 6.2 deep-scrub starts
Dec 06 09:43:40 compute-1 ceph-mon[79770]: 6.2 deep-scrub ok
Dec 06 09:43:40 compute-1 podman[88043]: 2025-12-06 09:43:40.963694823 +0000 UTC m=+0.029622557 container died 2b1801986393e8e2cbe7b4cdadc22f24012f42b9768a29cb7ee64c55eabe33b4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 06 09:43:40 compute-1 systemd[1]: var-lib-containers-storage-overlay-fd7e7e9e4ddac3e74b3b7bc6b20dd5bb2fcc490030f679e68f53a0a8ada38ac6-merged.mount: Deactivated successfully.
Dec 06 09:43:40 compute-1 podman[88043]: 2025-12-06 09:43:40.998903992 +0000 UTC m=+0.064831326 container remove 2b1801986393e8e2cbe7b4cdadc22f24012f42b9768a29cb7ee64c55eabe33b4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.license=GPLv2, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 06 09:43:41 compute-1 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Main process exited, code=exited, status=139/n/a
Dec 06 09:43:41 compute-1 systemd[81504]: Starting Mark boot as successful...
Dec 06 09:43:41 compute-1 systemd[81504]: Finished Mark boot as successful.
Dec 06 09:43:41 compute-1 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Failed with result 'exit-code'.
Dec 06 09:43:41 compute-1 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Consumed 2.627s CPU time.
Dec 06 09:43:41 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 10.12 scrub starts
Dec 06 09:43:41 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 10.12 scrub ok
Dec 06 09:43:41 compute-1 sudo[88227]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ancgdmavslxnnajujgxtyqqiokxqnqdo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014221.1717453-163-170375899581259/AnsiballZ_file.py'
Dec 06 09:43:41 compute-1 sudo[88227]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:41 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:43:41 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:43:41 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:41.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:43:41 compute-1 python3.9[88229]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:43:41 compute-1 sudo[88227]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:41 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:43:41 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:43:41 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:41.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:43:42 compute-1 ceph-mon[79770]: 12.1c scrub starts
Dec 06 09:43:42 compute-1 ceph-mon[79770]: 12.1c scrub ok
Dec 06 09:43:42 compute-1 ceph-mon[79770]: 11.e scrub starts
Dec 06 09:43:42 compute-1 ceph-mon[79770]: 11.e scrub ok
Dec 06 09:43:42 compute-1 ceph-mon[79770]: 6.7 scrub starts
Dec 06 09:43:42 compute-1 sudo[88380]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qltdnnzultfzdnkrurpscjrhrizfvvpm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014222.0380774-190-150796153330576/AnsiballZ_file.py'
Dec 06 09:43:42 compute-1 sudo[88380]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:42 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 10.d deep-scrub starts
Dec 06 09:43:42 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 10.d deep-scrub ok
Dec 06 09:43:42 compute-1 python3.9[88382]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:43:42 compute-1 sudo[88380]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:42 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e99 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 09:43:43 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 10.8 deep-scrub starts
Dec 06 09:43:43 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 10.8 deep-scrub ok
Dec 06 09:43:43 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:43:43 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:43:43 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:43.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:43:43 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:43:43 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:43:43 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:43.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:43:44 compute-1 ceph-mon[79770]: 10.12 scrub starts
Dec 06 09:43:44 compute-1 ceph-mon[79770]: 10.12 scrub ok
Dec 06 09:43:44 compute-1 ceph-mon[79770]: 6.7 scrub ok
Dec 06 09:43:44 compute-1 ceph-mon[79770]: pgmap v19: 337 pgs: 1 peering, 3 active+remapped, 333 active+clean; 458 KiB data, 143 MiB used, 60 GiB / 60 GiB avail; 25 B/s, 1 objects/s recovering
Dec 06 09:43:44 compute-1 ceph-mon[79770]: 11.3 scrub starts
Dec 06 09:43:44 compute-1 ceph-mon[79770]: 11.3 scrub ok
Dec 06 09:43:44 compute-1 ceph-mon[79770]: 10.d deep-scrub starts
Dec 06 09:43:44 compute-1 ceph-mon[79770]: 10.d deep-scrub ok
Dec 06 09:43:44 compute-1 ceph-mon[79770]: 6.a scrub starts
Dec 06 09:43:44 compute-1 ceph-mon[79770]: 6.a scrub ok
Dec 06 09:43:44 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:43:44 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 6.e scrub starts
Dec 06 09:43:44 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 6.e scrub ok
Dec 06 09:43:44 compute-1 sudo[88489]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 09:43:44 compute-1 sudo[88489]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:43:44 compute-1 sudo[88489]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:44 compute-1 sudo[88551]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:43:44 compute-1 sudo[88551]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:43:44 compute-1 sudo[88551]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:44 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e100 e100: 3 total, 3 up, 3 in
Dec 06 09:43:44 compute-1 python3.9[88566]: ansible-ansible.builtin.service_facts Invoked
Dec 06 09:43:44 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 100 pg[6.e( v 50'39 (0'0,50'39] local-lis/les=76/77 n=1 ec=54/21 lis/c=76/76 les/c/f=77/77/0 sis=100 pruub=8.985174179s) [1] r=-1 lpr=100 pi=[76,100)/1 crt=50'39 mlcod 50'39 active pruub 257.989654541s@ mbc={255={}}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:43:44 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 100 pg[6.e( v 50'39 (0'0,50'39] local-lis/les=76/77 n=1 ec=54/21 lis/c=76/76 les/c/f=77/77/0 sis=100 pruub=8.984956741s) [1] r=-1 lpr=100 pi=[76,100)/1 crt=50'39 mlcod 0'0 unknown NOTIFY pruub 257.989654541s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:43:44 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 100 pg[10.d( v 51'1027 (0'0,51'1027] local-lis/les=79/80 n=8 ec=58/45 lis/c=79/79 les/c/f=80/80/0 sis=100 pruub=12.035808563s) [1] r=-1 lpr=100 pi=[79,100)/1 crt=51'1027 mlcod 0'0 active pruub 261.040283203s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:43:44 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 100 pg[10.d( v 51'1027 (0'0,51'1027] local-lis/les=79/80 n=8 ec=58/45 lis/c=79/79 les/c/f=80/80/0 sis=100 pruub=12.035377502s) [1] r=-1 lpr=100 pi=[79,100)/1 crt=51'1027 mlcod 0'0 unknown NOTIFY pruub 261.040283203s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:43:44 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 100 pg[10.1d( v 51'1027 (0'0,51'1027] local-lis/les=79/80 n=5 ec=58/45 lis/c=79/79 les/c/f=80/80/0 sis=100 pruub=12.038669586s) [1] r=-1 lpr=100 pi=[79,100)/1 crt=51'1027 mlcod 0'0 active pruub 261.043975830s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:43:44 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 100 pg[10.1d( v 51'1027 (0'0,51'1027] local-lis/les=79/80 n=5 ec=58/45 lis/c=79/79 les/c/f=80/80/0 sis=100 pruub=12.038622856s) [1] r=-1 lpr=100 pi=[79,100)/1 crt=51'1027 mlcod 0'0 unknown NOTIFY pruub 261.043975830s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:43:45 compute-1 network[88600]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 06 09:43:45 compute-1 network[88601]: 'network-scripts' will be removed from distribution in near future.
Dec 06 09:43:45 compute-1 network[88602]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 06 09:43:45 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/094345 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec 06 09:43:45 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 10.5 scrub starts
Dec 06 09:43:45 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 10.5 scrub ok
Dec 06 09:43:45 compute-1 ceph-mon[79770]: 6.1 scrub starts
Dec 06 09:43:45 compute-1 ceph-mon[79770]: 6.1 scrub ok
Dec 06 09:43:45 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:43:45 compute-1 ceph-mon[79770]: 10.8 deep-scrub starts
Dec 06 09:43:45 compute-1 ceph-mon[79770]: 10.8 deep-scrub ok
Dec 06 09:43:45 compute-1 ceph-mon[79770]: 10.6 scrub starts
Dec 06 09:43:45 compute-1 ceph-mon[79770]: 10.6 scrub ok
Dec 06 09:43:45 compute-1 ceph-mon[79770]: pgmap v20: 337 pgs: 337 active+clean; 458 KiB data, 144 MiB used, 60 GiB / 60 GiB avail
Dec 06 09:43:45 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "15"}]: dispatch
Dec 06 09:43:45 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "14"}]: dispatch
Dec 06 09:43:45 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:43:45 compute-1 ceph-mon[79770]: 10.19 scrub starts
Dec 06 09:43:45 compute-1 ceph-mon[79770]: 10.19 scrub ok
Dec 06 09:43:45 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "15"}]': finished
Dec 06 09:43:45 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "14"}]': finished
Dec 06 09:43:45 compute-1 ceph-mon[79770]: osdmap e100: 3 total, 3 up, 3 in
Dec 06 09:43:45 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Dec 06 09:43:45 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch
Dec 06 09:43:45 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 09:43:45 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:43:45 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:43:45 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:45.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:43:45 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:43:45 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:43:45 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:45.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:43:46 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 10.18 scrub starts
Dec 06 09:43:46 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 10.18 scrub ok
Dec 06 09:43:47 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e101 e101: 3 total, 3 up, 3 in
Dec 06 09:43:47 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 101 pg[10.d( v 51'1027 (0'0,51'1027] local-lis/les=79/80 n=8 ec=58/45 lis/c=79/79 les/c/f=80/80/0 sis=101) [1]/[0] r=0 lpr=101 pi=[79,101)/1 crt=51'1027 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:43:47 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 101 pg[10.d( v 51'1027 (0'0,51'1027] local-lis/les=79/80 n=8 ec=58/45 lis/c=79/79 les/c/f=80/80/0 sis=101) [1]/[0] r=0 lpr=101 pi=[79,101)/1 crt=51'1027 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Dec 06 09:43:47 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 101 pg[10.1d( v 51'1027 (0'0,51'1027] local-lis/les=79/80 n=5 ec=58/45 lis/c=79/79 les/c/f=80/80/0 sis=101) [1]/[0] r=0 lpr=101 pi=[79,101)/1 crt=51'1027 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:43:47 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 101 pg[10.1d( v 51'1027 (0'0,51'1027] local-lis/les=79/80 n=5 ec=58/45 lis/c=79/79 les/c/f=80/80/0 sis=101) [1]/[0] r=0 lpr=101 pi=[79,101)/1 crt=51'1027 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Dec 06 09:43:47 compute-1 ceph-mon[79770]: 10.1 scrub starts
Dec 06 09:43:47 compute-1 ceph-mon[79770]: 10.1 scrub ok
Dec 06 09:43:47 compute-1 ceph-mon[79770]: 6.e scrub starts
Dec 06 09:43:47 compute-1 ceph-mon[79770]: 6.e scrub ok
Dec 06 09:43:47 compute-1 ceph-mon[79770]: Reconfiguring mon.compute-0 (monmap changed)...
Dec 06 09:43:47 compute-1 ceph-mon[79770]: Reconfiguring daemon mon.compute-0 on compute-0
Dec 06 09:43:47 compute-1 ceph-mon[79770]: 10.5 scrub starts
Dec 06 09:43:47 compute-1 ceph-mon[79770]: 10.5 scrub ok
Dec 06 09:43:47 compute-1 ceph-mon[79770]: 10.c scrub starts
Dec 06 09:43:47 compute-1 ceph-mon[79770]: 10.c scrub ok
Dec 06 09:43:47 compute-1 ceph-mon[79770]: pgmap v22: 337 pgs: 337 active+clean; 458 KiB data, 144 MiB used, 60 GiB / 60 GiB avail
Dec 06 09:43:47 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "16"}]: dispatch
Dec 06 09:43:47 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "15"}]: dispatch
Dec 06 09:43:47 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 6.b scrub starts
Dec 06 09:43:47 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 6.b scrub ok
Dec 06 09:43:47 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:43:47 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:43:47 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:47.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:43:47 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e101 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 09:43:47 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:43:47 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:43:47 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:47.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:43:48 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 10.15 scrub starts
Dec 06 09:43:48 compute-1 ceph-osd[77465]: log_channel(cluster) log [DBG] : 10.15 scrub ok
Dec 06 09:43:49 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:43:49 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:43:49 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:49.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:43:50 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:43:50 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:43:50 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:50.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:43:50 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e102 e102: 3 total, 3 up, 3 in
Dec 06 09:43:50 compute-1 ceph-mon[79770]: 10.4 scrub starts
Dec 06 09:43:50 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 102 pg[6.f( empty local-lis/les=0/0 n=0 ec=54/21 lis/c=64/64 les/c/f=65/65/0 sis=102) [0] r=0 lpr=102 pi=[64,102)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:43:50 compute-1 ceph-mon[79770]: 10.4 scrub ok
Dec 06 09:43:50 compute-1 ceph-mon[79770]: 10.3 scrub starts
Dec 06 09:43:50 compute-1 ceph-mon[79770]: 10.3 scrub ok
Dec 06 09:43:50 compute-1 ceph-mon[79770]: 10.18 scrub starts
Dec 06 09:43:50 compute-1 ceph-mon[79770]: 10.18 scrub ok
Dec 06 09:43:50 compute-1 ceph-mon[79770]: 10.1c scrub starts
Dec 06 09:43:50 compute-1 ceph-mon[79770]: 10.1c scrub ok
Dec 06 09:43:50 compute-1 ceph-mon[79770]: osdmap e101: 3 total, 3 up, 3 in
Dec 06 09:43:50 compute-1 ceph-mon[79770]: 10.14 scrub starts
Dec 06 09:43:50 compute-1 ceph-mon[79770]: 10.14 scrub ok
Dec 06 09:43:50 compute-1 ceph-mon[79770]: 6.b scrub starts
Dec 06 09:43:50 compute-1 ceph-mon[79770]: 6.b scrub ok
Dec 06 09:43:50 compute-1 ceph-mon[79770]: 10.1e scrub starts
Dec 06 09:43:50 compute-1 ceph-mon[79770]: pgmap v24: 337 pgs: 2 remapped+peering, 335 active+clean; 457 KiB data, 144 MiB used, 60 GiB / 60 GiB avail
Dec 06 09:43:50 compute-1 ceph-mon[79770]: 10.1e scrub ok
Dec 06 09:43:50 compute-1 ceph-mon[79770]: 10.10 scrub starts
Dec 06 09:43:50 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 102 pg[10.d( v 51'1027 (0'0,51'1027] local-lis/les=101/102 n=8 ec=58/45 lis/c=79/79 les/c/f=80/80/0 sis=101) [1]/[0] async=[1] r=0 lpr=101 pi=[79,101)/1 crt=51'1027 mlcod 0'0 active+remapped mbc={255={(0+1)=8}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:43:50 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 102 pg[10.1d( v 51'1027 (0'0,51'1027] local-lis/les=101/102 n=5 ec=58/45 lis/c=79/79 les/c/f=80/80/0 sis=101) [1]/[0] async=[1] r=0 lpr=101 pi=[79,101)/1 crt=51'1027 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:43:51 compute-1 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Scheduled restart job, restart counter is at 1.
Dec 06 09:43:51 compute-1 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.djsnbu for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec 06 09:43:51 compute-1 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Consumed 2.627s CPU time.
Dec 06 09:43:51 compute-1 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.djsnbu for 5ecd3f74-dade-5fc4-92ce-8950ae424258...
Dec 06 09:43:51 compute-1 python3.9[88874]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:43:51 compute-1 podman[88914]: 2025-12-06 09:43:51.461902961 +0000 UTC m=+0.062415184 container create 90da3924cacc3efa403ea45549b6824092497428ac34121bd882ee64a78e789c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 06 09:43:51 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f61a3b3a24536c7c4e747ec6bc4b7c9a3e3b3c6a417aa3b9f3cf3ab295997eb4/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec 06 09:43:51 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f61a3b3a24536c7c4e747ec6bc4b7c9a3e3b3c6a417aa3b9f3cf3ab295997eb4/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 06 09:43:51 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f61a3b3a24536c7c4e747ec6bc4b7c9a3e3b3c6a417aa3b9f3cf3ab295997eb4/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec 06 09:43:51 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f61a3b3a24536c7c4e747ec6bc4b7c9a3e3b3c6a417aa3b9f3cf3ab295997eb4/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.djsnbu-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec 06 09:43:51 compute-1 podman[88914]: 2025-12-06 09:43:51.427743619 +0000 UTC m=+0.028255892 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 06 09:43:51 compute-1 podman[88914]: 2025-12-06 09:43:51.530290546 +0000 UTC m=+0.130802749 container init 90da3924cacc3efa403ea45549b6824092497428ac34121bd882ee64a78e789c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.license=GPLv2)
Dec 06 09:43:51 compute-1 podman[88914]: 2025-12-06 09:43:51.544069588 +0000 UTC m=+0.144581771 container start 90da3924cacc3efa403ea45549b6824092497428ac34121bd882ee64a78e789c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid)
Dec 06 09:43:51 compute-1 bash[88914]: 90da3924cacc3efa403ea45549b6824092497428ac34121bd882ee64a78e789c
Dec 06 09:43:51 compute-1 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.djsnbu for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec 06 09:43:51 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:43:51 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec 06 09:43:51 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:43:51 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec 06 09:43:51 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:43:51 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:43:51 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:51.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:43:52 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:43:52 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:43:52 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:52.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:43:52 compute-1 python3.9[89098]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:43:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:43:52 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec 06 09:43:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:43:52 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec 06 09:43:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:43:52 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec 06 09:43:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:43:52 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec 06 09:43:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:43:52 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec 06 09:43:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:43:52 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 09:43:52 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e102 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 09:43:53 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e103 e103: 3 total, 3 up, 3 in
Dec 06 09:43:53 compute-1 ceph-mon[79770]: 10.15 scrub starts
Dec 06 09:43:53 compute-1 ceph-mon[79770]: 10.15 scrub ok
Dec 06 09:43:53 compute-1 ceph-mon[79770]: 10.10 scrub ok
Dec 06 09:43:53 compute-1 ceph-mon[79770]: 10.9 scrub starts
Dec 06 09:43:53 compute-1 ceph-mon[79770]: pgmap v25: 337 pgs: 2 remapped+peering, 335 active+clean; 457 KiB data, 144 MiB used, 60 GiB / 60 GiB avail
Dec 06 09:43:53 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "16"}]': finished
Dec 06 09:43:53 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "15"}]': finished
Dec 06 09:43:53 compute-1 ceph-mon[79770]: 10.9 scrub ok
Dec 06 09:43:53 compute-1 ceph-mon[79770]: osdmap e102: 3 total, 3 up, 3 in
Dec 06 09:43:53 compute-1 ceph-mon[79770]: 6.5 scrub starts
Dec 06 09:43:53 compute-1 ceph-mon[79770]: 6.5 scrub ok
Dec 06 09:43:53 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:43:53 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:43:53 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-0.qhdjwa", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Dec 06 09:43:53 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "mgr services"}]: dispatch
Dec 06 09:43:53 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 09:43:53 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 103 pg[6.f( v 50'39 lc 48'1 (0'0,50'39] local-lis/les=102/103 n=3 ec=54/21 lis/c=64/64 les/c/f=65/65/0 sis=102) [0] r=0 lpr=102 pi=[64,102)/1 crt=50'39 lcod 0'0 mlcod 0'0 active+degraded m=3 mbc={255={(0+1)=3}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:43:53 compute-1 python3.9[89275]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:43:53 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:43:53 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:43:53 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:53.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:43:53 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e104 e104: 3 total, 3 up, 3 in
Dec 06 09:43:53 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 104 pg[10.1d( v 51'1027 (0'0,51'1027] local-lis/les=101/102 n=5 ec=58/45 lis/c=101/79 les/c/f=102/80/0 sis=104 pruub=13.127288818s) [1] async=[1] r=-1 lpr=104 pi=[79,104)/1 crt=51'1027 mlcod 51'1027 active pruub 270.998352051s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:43:53 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 104 pg[10.1d( v 51'1027 (0'0,51'1027] local-lis/les=101/102 n=5 ec=58/45 lis/c=101/79 les/c/f=102/80/0 sis=104 pruub=13.127108574s) [1] r=-1 lpr=104 pi=[79,104)/1 crt=51'1027 mlcod 0'0 unknown NOTIFY pruub 270.998352051s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:43:53 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 104 pg[10.d( v 51'1027 (0'0,51'1027] local-lis/les=101/102 n=8 ec=58/45 lis/c=101/79 les/c/f=102/80/0 sis=104 pruub=13.113863945s) [1] async=[1] r=-1 lpr=104 pi=[79,104)/1 crt=51'1027 mlcod 51'1027 active pruub 270.985778809s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:43:53 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 104 pg[10.d( v 51'1027 (0'0,51'1027] local-lis/les=101/102 n=8 ec=58/45 lis/c=101/79 les/c/f=102/80/0 sis=104 pruub=13.113585472s) [1] r=-1 lpr=104 pi=[79,104)/1 crt=51'1027 mlcod 0'0 unknown NOTIFY pruub 270.985778809s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:43:54 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:43:54 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:43:54 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:54.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:43:54 compute-1 ceph-mon[79770]: Reconfiguring mgr.compute-0.qhdjwa (monmap changed)...
Dec 06 09:43:54 compute-1 ceph-mon[79770]: Reconfiguring daemon mgr.compute-0.qhdjwa on compute-0
Dec 06 09:43:54 compute-1 ceph-mon[79770]: pgmap v27: 337 pgs: 2 remapped+peering, 335 active+clean; 457 KiB data, 144 MiB used, 60 GiB / 60 GiB avail
Dec 06 09:43:54 compute-1 ceph-mon[79770]: osdmap e103: 3 total, 3 up, 3 in
Dec 06 09:43:54 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:43:54 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:43:54 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-0", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Dec 06 09:43:54 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 09:43:54 compute-1 ceph-mon[79770]: osdmap e104: 3 total, 3 up, 3 in
Dec 06 09:43:54 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 09:43:54 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e105 e105: 3 total, 3 up, 3 in
Dec 06 09:43:55 compute-1 sudo[89432]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xulzkgiwfdggpzxcjditrsycvelmtpyi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014234.6450326-334-76364663108491/AnsiballZ_setup.py'
Dec 06 09:43:55 compute-1 sudo[89432]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:55 compute-1 ceph-mon[79770]: Reconfiguring crash.compute-0 (monmap changed)...
Dec 06 09:43:55 compute-1 ceph-mon[79770]: Reconfiguring daemon crash.compute-0 on compute-0
Dec 06 09:43:55 compute-1 ceph-mon[79770]: pgmap v29: 337 pgs: 1 active+recovering+remapped, 1 active+remapped, 1 peering, 334 active+clean; 457 KiB data, 148 MiB used, 60 GiB / 60 GiB avail; 5/223 objects misplaced (2.242%)
Dec 06 09:43:55 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:43:55 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:43:55 compute-1 ceph-mon[79770]: Reconfiguring osd.1 (monmap changed)...
Dec 06 09:43:55 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch
Dec 06 09:43:55 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 09:43:55 compute-1 ceph-mon[79770]: Reconfiguring daemon osd.1 on compute-0
Dec 06 09:43:55 compute-1 ceph-mon[79770]: osdmap e105: 3 total, 3 up, 3 in
Dec 06 09:43:55 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:43:55 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:43:55 compute-1 python3.9[89434]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 06 09:43:55 compute-1 sudo[89432]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:55 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:43:55 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:43:55 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:55.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:43:56 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:43:56 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:43:56 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:56.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:43:56 compute-1 sudo[89516]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mvrlbmcmtknrirzuhachogxgtagpifzn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014234.6450326-334-76364663108491/AnsiballZ_dnf.py'
Dec 06 09:43:56 compute-1 sudo[89516]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:43:56 compute-1 ceph-mon[79770]: Reconfiguring alertmanager.compute-0 (dependencies changed)...
Dec 06 09:43:56 compute-1 ceph-mon[79770]: Reconfiguring daemon alertmanager.compute-0 on compute-0
Dec 06 09:43:56 compute-1 python3.9[89518]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 06 09:43:57 compute-1 ceph-mon[79770]: pgmap v32: 337 pgs: 1 active+recovering+remapped, 1 active+remapped, 1 peering, 334 active+clean; 457 KiB data, 148 MiB used, 60 GiB / 60 GiB avail; 5/223 objects misplaced (2.242%)
Dec 06 09:43:57 compute-1 ceph-mon[79770]: 10.1d scrub starts
Dec 06 09:43:57 compute-1 ceph-mon[79770]: 10.1d scrub ok
Dec 06 09:43:57 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:43:57 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:43:57 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:43:57 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:43:57 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:57.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:43:57 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e105 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 09:43:58 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:43:58 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:43:58 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:43:58.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:43:58 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e106 e106: 3 total, 3 up, 3 in
Dec 06 09:43:58 compute-1 ceph-mon[79770]: 6.d scrub starts
Dec 06 09:43:58 compute-1 ceph-mon[79770]: 6.d scrub ok
Dec 06 09:43:58 compute-1 ceph-mon[79770]: Reconfiguring grafana.compute-0 (dependencies changed)...
Dec 06 09:43:58 compute-1 ceph-mon[79770]: Reconfiguring daemon grafana.compute-0 on compute-0
Dec 06 09:43:58 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]: dispatch
Dec 06 09:43:58 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:43:58 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 09:43:58 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:43:58 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 09:43:58 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e107 e107: 3 total, 3 up, 3 in
Dec 06 09:43:59 compute-1 sudo[89584]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:43:59 compute-1 sudo[89584]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:43:59 compute-1 sudo[89584]: pam_unix(sudo:session): session closed for user root
Dec 06 09:43:59 compute-1 ceph-mon[79770]: pgmap v33: 337 pgs: 337 active+clean; 457 KiB data, 148 MiB used, 60 GiB / 60 GiB avail
Dec 06 09:43:59 compute-1 ceph-mon[79770]: 10.7 scrub starts
Dec 06 09:43:59 compute-1 ceph-mon[79770]: 10.7 scrub ok
Dec 06 09:43:59 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]': finished
Dec 06 09:43:59 compute-1 ceph-mon[79770]: osdmap e106: 3 total, 3 up, 3 in
Dec 06 09:43:59 compute-1 ceph-mon[79770]: osdmap e107: 3 total, 3 up, 3 in
Dec 06 09:43:59 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:43:59 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:43:59 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-1", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Dec 06 09:43:59 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 09:43:59 compute-1 sudo[89609]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 _orch deploy --fsid 5ecd3f74-dade-5fc4-92ce-8950ae424258
Dec 06 09:43:59 compute-1 sudo[89609]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:43:59 compute-1 podman[89650]: 2025-12-06 09:43:59.725850537 +0000 UTC m=+0.051339782 container create 37fe98652af2394c8044d3aff43021e8a09b95485a8991619a9fb9d6bc2c043d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=frosty_edison, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 06 09:43:59 compute-1 systemd[1]: Started libpod-conmon-37fe98652af2394c8044d3aff43021e8a09b95485a8991619a9fb9d6bc2c043d.scope.
Dec 06 09:43:59 compute-1 podman[89650]: 2025-12-06 09:43:59.706306278 +0000 UTC m=+0.031795553 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 06 09:43:59 compute-1 systemd[1]: Started libcrun container.
Dec 06 09:43:59 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:43:59 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:43:59 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:43:59.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:43:59 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e108 e108: 3 total, 3 up, 3 in
Dec 06 09:43:59 compute-1 podman[89650]: 2025-12-06 09:43:59.846443174 +0000 UTC m=+0.171932519 container init 37fe98652af2394c8044d3aff43021e8a09b95485a8991619a9fb9d6bc2c043d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=frosty_edison, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325)
Dec 06 09:43:59 compute-1 podman[89650]: 2025-12-06 09:43:59.861384516 +0000 UTC m=+0.186873811 container start 37fe98652af2394c8044d3aff43021e8a09b95485a8991619a9fb9d6bc2c043d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=frosty_edison, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2)
Dec 06 09:43:59 compute-1 podman[89650]: 2025-12-06 09:43:59.866296031 +0000 UTC m=+0.191785326 container attach 37fe98652af2394c8044d3aff43021e8a09b95485a8991619a9fb9d6bc2c043d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=frosty_edison, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 06 09:43:59 compute-1 frosty_edison[89667]: 167 167
Dec 06 09:43:59 compute-1 systemd[1]: libpod-37fe98652af2394c8044d3aff43021e8a09b95485a8991619a9fb9d6bc2c043d.scope: Deactivated successfully.
Dec 06 09:43:59 compute-1 conmon[89667]: conmon 37fe98652af2394c8044 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-37fe98652af2394c8044d3aff43021e8a09b95485a8991619a9fb9d6bc2c043d.scope/container/memory.events
Dec 06 09:43:59 compute-1 podman[89650]: 2025-12-06 09:43:59.871860543 +0000 UTC m=+0.197349798 container died 37fe98652af2394c8044d3aff43021e8a09b95485a8991619a9fb9d6bc2c043d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=frosty_edison, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.license=GPLv2, io.buildah.version=1.40.1)
Dec 06 09:43:59 compute-1 systemd[1]: var-lib-containers-storage-overlay-a369bfac63ba24ce3a211111e3ff78512d8d345c1c8cc876d5d4f7d91ce22fb4-merged.mount: Deactivated successfully.
Dec 06 09:43:59 compute-1 podman[89650]: 2025-12-06 09:43:59.913242819 +0000 UTC m=+0.238732074 container remove 37fe98652af2394c8044d3aff43021e8a09b95485a8991619a9fb9d6bc2c043d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=frosty_edison, ceph=True, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS)
Dec 06 09:43:59 compute-1 systemd[1]: libpod-conmon-37fe98652af2394c8044d3aff43021e8a09b95485a8991619a9fb9d6bc2c043d.scope: Deactivated successfully.
Dec 06 09:43:59 compute-1 sudo[89609]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:00 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:44:00 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:44:00 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:00.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:44:00 compute-1 ceph-mon[79770]: Reconfiguring crash.compute-1 (monmap changed)...
Dec 06 09:44:00 compute-1 ceph-mon[79770]: Reconfiguring daemon crash.compute-1 on compute-1
Dec 06 09:44:00 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]: dispatch
Dec 06 09:44:00 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]': finished
Dec 06 09:44:00 compute-1 ceph-mon[79770]: osdmap e108: 3 total, 3 up, 3 in
Dec 06 09:44:00 compute-1 sudo[89685]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:44:00 compute-1 sudo[89685]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:44:00 compute-1 sudo[89685]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:00 compute-1 sudo[89710]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 _orch deploy --fsid 5ecd3f74-dade-5fc4-92ce-8950ae424258
Dec 06 09:44:00 compute-1 sudo[89710]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:44:00 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e109 e109: 3 total, 3 up, 3 in
Dec 06 09:44:00 compute-1 podman[89753]: 2025-12-06 09:44:00.963743658 +0000 UTC m=+0.056563984 container create 943b0aaa17a4367da058ca27a36c4be50f678046646619f95b30232557802b47 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=awesome_torvalds, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True)
Dec 06 09:44:01 compute-1 systemd[1]: Started libpod-conmon-943b0aaa17a4367da058ca27a36c4be50f678046646619f95b30232557802b47.scope.
Dec 06 09:44:01 compute-1 podman[89753]: 2025-12-06 09:44:00.936736909 +0000 UTC m=+0.029557315 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 06 09:44:01 compute-1 systemd[1]: Started libcrun container.
Dec 06 09:44:01 compute-1 podman[89753]: 2025-12-06 09:44:01.065957937 +0000 UTC m=+0.158778343 container init 943b0aaa17a4367da058ca27a36c4be50f678046646619f95b30232557802b47 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=awesome_torvalds, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.build-date=20250325)
Dec 06 09:44:01 compute-1 podman[89753]: 2025-12-06 09:44:01.079100903 +0000 UTC m=+0.171921229 container start 943b0aaa17a4367da058ca27a36c4be50f678046646619f95b30232557802b47 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=awesome_torvalds, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, ceph=True, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 09:44:01 compute-1 podman[89753]: 2025-12-06 09:44:01.082460308 +0000 UTC m=+0.175280724 container attach 943b0aaa17a4367da058ca27a36c4be50f678046646619f95b30232557802b47 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=awesome_torvalds, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 09:44:01 compute-1 awesome_torvalds[89769]: 167 167
Dec 06 09:44:01 compute-1 systemd[1]: libpod-943b0aaa17a4367da058ca27a36c4be50f678046646619f95b30232557802b47.scope: Deactivated successfully.
Dec 06 09:44:01 compute-1 podman[89753]: 2025-12-06 09:44:01.088302497 +0000 UTC m=+0.181122883 container died 943b0aaa17a4367da058ca27a36c4be50f678046646619f95b30232557802b47 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=awesome_torvalds, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, ceph=True, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 06 09:44:01 compute-1 systemd[1]: var-lib-containers-storage-overlay-414f91675d0f2345b476b78ddef5cde869913f88bf3577b859b96b96ad00ca4c-merged.mount: Deactivated successfully.
Dec 06 09:44:01 compute-1 podman[89753]: 2025-12-06 09:44:01.150963417 +0000 UTC m=+0.243783783 container remove 943b0aaa17a4367da058ca27a36c4be50f678046646619f95b30232557802b47 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=awesome_torvalds, CEPH_REF=squid, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 06 09:44:01 compute-1 systemd[1]: libpod-conmon-943b0aaa17a4367da058ca27a36c4be50f678046646619f95b30232557802b47.scope: Deactivated successfully.
Dec 06 09:44:01 compute-1 sudo[89710]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:01 compute-1 ceph-mon[79770]: pgmap v36: 337 pgs: 337 active+clean; 457 KiB data, 148 MiB used, 60 GiB / 60 GiB avail; 5 B/s, 0 objects/s recovering
Dec 06 09:44:01 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:44:01 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:44:01 compute-1 ceph-mon[79770]: Reconfiguring osd.0 (monmap changed)...
Dec 06 09:44:01 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch
Dec 06 09:44:01 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 09:44:01 compute-1 ceph-mon[79770]: Reconfiguring daemon osd.0 on compute-1
Dec 06 09:44:01 compute-1 ceph-mon[79770]: osdmap e109: 3 total, 3 up, 3 in
Dec 06 09:44:01 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:44:01 compute-1 sudo[89796]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:44:01 compute-1 sudo[89796]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:44:01 compute-1 sudo[89796]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:01 compute-1 sudo[89821]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 _orch deploy --fsid 5ecd3f74-dade-5fc4-92ce-8950ae424258
Dec 06 09:44:01 compute-1 sudo[89821]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:44:01 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:44:01 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:44:01 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:01.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:44:01 compute-1 podman[89862]: 2025-12-06 09:44:01.952053082 +0000 UTC m=+0.048354486 container create dadc92978cf546aaf08fc8c06c02e30483714e4ef7350cf9c57febfd630246be (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=happy_jemison, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, ceph=True, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 06 09:44:01 compute-1 systemd[1]: Started libpod-conmon-dadc92978cf546aaf08fc8c06c02e30483714e4ef7350cf9c57febfd630246be.scope.
Dec 06 09:44:02 compute-1 systemd[1]: Started libcrun container.
Dec 06 09:44:02 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:44:02 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:44:02 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:02.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:44:02 compute-1 podman[89862]: 2025-12-06 09:44:01.932742819 +0000 UTC m=+0.029044253 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 06 09:44:02 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e110 e110: 3 total, 3 up, 3 in
Dec 06 09:44:02 compute-1 podman[89862]: 2025-12-06 09:44:02.035680716 +0000 UTC m=+0.131982140 container init dadc92978cf546aaf08fc8c06c02e30483714e4ef7350cf9c57febfd630246be (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=happy_jemison, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 06 09:44:02 compute-1 podman[89862]: 2025-12-06 09:44:02.043129016 +0000 UTC m=+0.139430410 container start dadc92978cf546aaf08fc8c06c02e30483714e4ef7350cf9c57febfd630246be (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=happy_jemison, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1)
Dec 06 09:44:02 compute-1 podman[89862]: 2025-12-06 09:44:02.046477051 +0000 UTC m=+0.142778455 container attach dadc92978cf546aaf08fc8c06c02e30483714e4ef7350cf9c57febfd630246be (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=happy_jemison, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 06 09:44:02 compute-1 happy_jemison[89878]: 167 167
Dec 06 09:44:02 compute-1 systemd[1]: libpod-dadc92978cf546aaf08fc8c06c02e30483714e4ef7350cf9c57febfd630246be.scope: Deactivated successfully.
Dec 06 09:44:02 compute-1 podman[89862]: 2025-12-06 09:44:02.048927204 +0000 UTC m=+0.145228608 container died dadc92978cf546aaf08fc8c06c02e30483714e4ef7350cf9c57febfd630246be (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=happy_jemison, ceph=True, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 06 09:44:02 compute-1 systemd[1]: var-lib-containers-storage-overlay-1f482cac385e65d30b3ff90a2272b3416c895fc4d05d57a7f4b44cfc5ab5b3e7-merged.mount: Deactivated successfully.
Dec 06 09:44:02 compute-1 podman[89862]: 2025-12-06 09:44:02.196240714 +0000 UTC m=+0.292542118 container remove dadc92978cf546aaf08fc8c06c02e30483714e4ef7350cf9c57febfd630246be (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=happy_jemison, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec 06 09:44:02 compute-1 systemd[1]: libpod-conmon-dadc92978cf546aaf08fc8c06c02e30483714e4ef7350cf9c57febfd630246be.scope: Deactivated successfully.
Dec 06 09:44:02 compute-1 sudo[89821]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:02 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:44:02 compute-1 ceph-mon[79770]: Reconfiguring mon.compute-1 (monmap changed)...
Dec 06 09:44:02 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Dec 06 09:44:02 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch
Dec 06 09:44:02 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 09:44:02 compute-1 ceph-mon[79770]: Reconfiguring daemon mon.compute-1 on compute-1
Dec 06 09:44:02 compute-1 ceph-mon[79770]: pgmap v39: 337 pgs: 337 active+clean; 457 KiB data, 148 MiB used, 60 GiB / 60 GiB avail
Dec 06 09:44:02 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "18"}]: dispatch
Dec 06 09:44:02 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "18"}]': finished
Dec 06 09:44:02 compute-1 ceph-mon[79770]: osdmap e110: 3 total, 3 up, 3 in
Dec 06 09:44:02 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:44:02 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:44:02 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Dec 06 09:44:02 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch
Dec 06 09:44:02 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 09:44:02 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 09:44:03 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e111 e111: 3 total, 3 up, 3 in
Dec 06 09:44:03 compute-1 ceph-mon[79770]: Reconfiguring mon.compute-2 (monmap changed)...
Dec 06 09:44:03 compute-1 ceph-mon[79770]: Reconfiguring daemon mon.compute-2 on compute-2
Dec 06 09:44:03 compute-1 ceph-mon[79770]: osdmap e111: 3 total, 3 up, 3 in
Dec 06 09:44:03 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:44:03 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:44:03 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:03.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:44:04 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:44:04 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:44:04 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:04.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:44:04 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e112 e112: 3 total, 3 up, 3 in
Dec 06 09:44:04 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 112 pg[10.12( v 51'1027 (0'0,51'1027] local-lis/les=68/69 n=4 ec=58/45 lis/c=68/68 les/c/f=69/69/0 sis=112 pruub=10.954301834s) [2] r=-1 lpr=112 pi=[68,112)/1 crt=51'1027 mlcod 0'0 active pruub 279.412811279s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:44:04 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 112 pg[10.12( v 51'1027 (0'0,51'1027] local-lis/les=68/69 n=4 ec=58/45 lis/c=68/68 les/c/f=69/69/0 sis=112 pruub=10.954230309s) [2] r=-1 lpr=112 pi=[68,112)/1 crt=51'1027 mlcod 0'0 unknown NOTIFY pruub 279.412811279s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:44:04 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:44:04 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:44:04 compute-1 ceph-mon[79770]: Reconfiguring mgr.compute-2.oazbvn (monmap changed)...
Dec 06 09:44:04 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-2.oazbvn", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Dec 06 09:44:04 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "mgr services"}]: dispatch
Dec 06 09:44:04 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 09:44:04 compute-1 ceph-mon[79770]: Reconfiguring daemon mgr.compute-2.oazbvn on compute-2
Dec 06 09:44:04 compute-1 ceph-mon[79770]: pgmap v42: 337 pgs: 1 active+remapped, 336 active+clean; 457 KiB data, 148 MiB used, 60 GiB / 60 GiB avail; 0 B/s, 0 objects/s recovering
Dec 06 09:44:04 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "19"}]: dispatch
Dec 06 09:44:04 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:44:04 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:44:04 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch
Dec 06 09:44:04 compute-1 ceph-mon[79770]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch
Dec 06 09:44:04 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch
Dec 06 09:44:04 compute-1 ceph-mon[79770]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch
Dec 06 09:44:04 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "dashboard set-grafana-api-url", "value": "https://192.168.122.100:3000"}]: dispatch
Dec 06 09:44:04 compute-1 ceph-mon[79770]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard set-grafana-api-url", "value": "https://192.168.122.100:3000"}]: dispatch
Dec 06 09:44:04 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:44:04 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "19"}]': finished
Dec 06 09:44:04 compute-1 ceph-mon[79770]: osdmap e112: 3 total, 3 up, 3 in
Dec 06 09:44:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:04 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec 06 09:44:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:04 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec 06 09:44:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:04 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec 06 09:44:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:04 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec 06 09:44:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:04 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec 06 09:44:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:04 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec 06 09:44:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:04 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec 06 09:44:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:04 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 06 09:44:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:04 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 06 09:44:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:04 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 06 09:44:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:04 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec 06 09:44:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:04 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 06 09:44:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:04 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec 06 09:44:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:04 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec 06 09:44:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:04 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec 06 09:44:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:04 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec 06 09:44:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:04 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec 06 09:44:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:04 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec 06 09:44:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:04 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec 06 09:44:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:04 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec 06 09:44:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:04 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec 06 09:44:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:04 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec 06 09:44:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:04 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec 06 09:44:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:04 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec 06 09:44:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:04 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec 06 09:44:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:04 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec 06 09:44:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:04 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec 06 09:44:04 compute-1 sudo[89897]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 09:44:04 compute-1 sudo[89897]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:44:04 compute-1 sudo[89897]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:04 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd34c000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:44:05 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:05 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd3480013b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:44:05 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e113 e113: 3 total, 3 up, 3 in
Dec 06 09:44:05 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 113 pg[10.12( v 51'1027 (0'0,51'1027] local-lis/les=68/69 n=4 ec=58/45 lis/c=68/68 les/c/f=69/69/0 sis=113) [2]/[0] r=0 lpr=113 pi=[68,113)/1 crt=51'1027 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:44:05 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 113 pg[10.12( v 51'1027 (0'0,51'1027] local-lis/les=68/69 n=4 ec=58/45 lis/c=68/68 les/c/f=69/69/0 sis=113) [2]/[0] r=0 lpr=113 pi=[68,113)/1 crt=51'1027 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Dec 06 09:44:05 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:44:05 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:44:05 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:05.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:44:05 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:05 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd324000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:44:06 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:44:06 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:44:06 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:06.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:44:06 compute-1 ceph-mon[79770]: osdmap e113: 3 total, 3 up, 3 in
Dec 06 09:44:06 compute-1 ceph-mon[79770]: pgmap v45: 337 pgs: 1 active+remapped, 336 active+clean; 457 KiB data, 148 MiB used, 60 GiB / 60 GiB avail; 0 B/s, 0 objects/s recovering
Dec 06 09:44:06 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]: dispatch
Dec 06 09:44:06 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e114 e114: 3 total, 3 up, 3 in
Dec 06 09:44:06 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 114 pg[10.13( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=114) [0] r=0 lpr=114 pi=[65,114)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:44:06 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 114 pg[10.12( v 51'1027 (0'0,51'1027] local-lis/les=113/114 n=4 ec=58/45 lis/c=68/68 les/c/f=69/69/0 sis=113) [2]/[0] async=[2] r=0 lpr=113 pi=[68,113)/1 crt=51'1027 mlcod 0'0 active+remapped mbc={255={(0+1)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:44:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:06 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd31c000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:44:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/094407 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec 06 09:44:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:07 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd31c000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:44:07 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]': finished
Dec 06 09:44:07 compute-1 ceph-mon[79770]: osdmap e114: 3 total, 3 up, 3 in
Dec 06 09:44:07 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e115 e115: 3 total, 3 up, 3 in
Dec 06 09:44:07 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 115 pg[10.12( v 51'1027 (0'0,51'1027] local-lis/les=113/114 n=4 ec=58/45 lis/c=113/68 les/c/f=114/69/0 sis=115 pruub=14.989471436s) [2] async=[2] r=-1 lpr=115 pi=[68,115)/1 crt=51'1027 mlcod 51'1027 active pruub 286.513122559s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:44:07 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 115 pg[10.12( v 51'1027 (0'0,51'1027] local-lis/les=113/114 n=4 ec=58/45 lis/c=113/68 les/c/f=114/69/0 sis=115 pruub=14.989409447s) [2] r=-1 lpr=115 pi=[68,115)/1 crt=51'1027 mlcod 0'0 unknown NOTIFY pruub 286.513122559s@ mbc={}] state<Start>: transitioning to Stray
Dec 06 09:44:07 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 115 pg[10.13( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=115) [0]/[2] r=-1 lpr=115 pi=[65,115)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:44:07 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 115 pg[10.13( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=65/65 les/c/f=66/66/0 sis=115) [0]/[2] r=-1 lpr=115 pi=[65,115)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 09:44:07 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:44:07 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:44:07 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:07.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:44:07 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e115 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 09:44:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:07 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd3480020b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:44:08 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:44:08 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:44:08 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:08.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:44:08 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e116 e116: 3 total, 3 up, 3 in
Dec 06 09:44:08 compute-1 ceph-mon[79770]: osdmap e115: 3 total, 3 up, 3 in
Dec 06 09:44:08 compute-1 ceph-mon[79770]: pgmap v48: 337 pgs: 1 remapped+peering, 1 peering, 335 active+clean; 457 KiB data, 148 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s; 27 B/s, 0 objects/s recovering
Dec 06 09:44:08 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:08 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd3240016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:44:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:09 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd31c000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:44:09 compute-1 ceph-mon[79770]: osdmap e116: 3 total, 3 up, 3 in
Dec 06 09:44:09 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:44:09 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:44:09 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 09:44:09 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 06 09:44:09 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:44:09 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 09:44:09 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:44:09 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 06 09:44:09 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 06 09:44:09 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 09:44:09 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:44:09 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:44:09 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:09.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:44:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:09 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd3280012e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:44:10 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:44:10 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:44:10 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:10.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:44:10 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e117 e117: 3 total, 3 up, 3 in
Dec 06 09:44:10 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 117 pg[10.13( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=5 ec=58/45 lis/c=115/65 les/c/f=116/66/0 sis=117) [0] r=0 lpr=117 pi=[65,117)/1 luod=0'0 crt=51'1027 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:44:10 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 117 pg[10.13( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=5 ec=58/45 lis/c=115/65 les/c/f=116/66/0 sis=117) [0] r=0 lpr=117 pi=[65,117)/1 crt=51'1027 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:44:10 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:10 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd3280012e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:44:11 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e118 e118: 3 total, 3 up, 3 in
Dec 06 09:44:11 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 118 pg[10.13( v 51'1027 (0'0,51'1027] local-lis/les=117/118 n=5 ec=58/45 lis/c=115/65 les/c/f=116/66/0 sis=117) [0] r=0 lpr=117 pi=[65,117)/1 crt=51'1027 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:44:11 compute-1 ceph-mon[79770]: pgmap v50: 337 pgs: 1 remapped+peering, 1 peering, 335 active+clean; 457 KiB data, 148 MiB used, 60 GiB / 60 GiB avail; 234 B/s rd, 0 B/s wr, 0 op/s; 25 B/s, 0 objects/s recovering
Dec 06 09:44:11 compute-1 ceph-mon[79770]: osdmap e117: 3 total, 3 up, 3 in
Dec 06 09:44:11 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:11 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd3280012e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:44:11 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:44:11 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:44:11 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:11.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:44:11 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:11 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd3240016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:44:12 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:44:12 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:44:12 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:12.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:44:12 compute-1 ceph-mon[79770]: osdmap e118: 3 total, 3 up, 3 in
Dec 06 09:44:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:12 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd3240016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:44:12 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e118 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 06 09:44:13 compute-1 ceph-mon[79770]: pgmap v53: 337 pgs: 1 remapped+peering, 1 peering, 335 active+clean; 457 KiB data, 148 MiB used, 60 GiB / 60 GiB avail; 237 B/s rd, 0 B/s wr, 0 op/s; 25 B/s, 0 objects/s recovering
Dec 06 09:44:13 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:13 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd3240016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:44:13 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:44:13 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:44:13 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:13.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:44:13 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:13 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd328002720 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:44:14 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:44:14 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:44:14 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:14.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:44:14 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e119 e119: 3 total, 3 up, 3 in
Dec 06 09:44:14 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 119 pg[10.14( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=74/74 les/c/f=75/75/0 sis=119) [0] r=0 lpr=119 pi=[74,119)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:44:14 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "21"}]: dispatch
Dec 06 09:44:14 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:14 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd318000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:44:15 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:15 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd3240016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:44:15 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e120 e120: 3 total, 3 up, 3 in
Dec 06 09:44:15 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 120 pg[10.14( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=74/74 les/c/f=75/75/0 sis=120) [0]/[2] r=-1 lpr=120 pi=[74,120)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:44:15 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 120 pg[10.14( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=74/74 les/c/f=75/75/0 sis=120) [0]/[2] r=-1 lpr=120 pi=[74,120)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 09:44:15 compute-1 ceph-mon[79770]: pgmap v54: 337 pgs: 337 active+clean; 457 KiB data, 148 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s; 18 B/s, 0 objects/s recovering
Dec 06 09:44:15 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "21"}]': finished
Dec 06 09:44:15 compute-1 ceph-mon[79770]: osdmap e119: 3 total, 3 up, 3 in
Dec 06 09:44:15 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:44:15 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:44:15 compute-1 sudo[89990]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:44:15 compute-1 sudo[89990]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:44:15 compute-1 sudo[89990]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:15 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:44:15 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:44:15 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:15.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:44:15 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:15 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd3240016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:44:16 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:44:16 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:44:16 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:16.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:44:16 compute-1 ceph-mon[79770]: osdmap e120: 3 total, 3 up, 3 in
Dec 06 09:44:16 compute-1 ceph-mon[79770]: pgmap v57: 337 pgs: 337 active+clean; 457 KiB data, 148 MiB used, 60 GiB / 60 GiB avail; 357 B/s rd, 0 op/s; 19 B/s, 0 objects/s recovering
Dec 06 09:44:16 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]: dispatch
Dec 06 09:44:16 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e121 e121: 3 total, 3 up, 3 in
Dec 06 09:44:16 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:16 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd328002720 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:44:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:17 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd3180016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:44:17 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]': finished
Dec 06 09:44:17 compute-1 ceph-mon[79770]: osdmap e121: 3 total, 3 up, 3 in
Dec 06 09:44:17 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e122 e122: 3 total, 3 up, 3 in
Dec 06 09:44:17 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 122 pg[10.14( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=5 ec=58/45 lis/c=120/74 les/c/f=121/75/0 sis=122) [0] r=0 lpr=122 pi=[74,122)/1 luod=0'0 crt=51'1027 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:44:17 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 122 pg[10.14( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=5 ec=58/45 lis/c=120/74 les/c/f=121/75/0 sis=122) [0] r=0 lpr=122 pi=[74,122)/1 crt=51'1027 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:44:17 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:44:17 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:44:17 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:17.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:44:17 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e122 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:44:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:17 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd3240016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:44:18 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:44:18 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec 06 09:44:18 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:18.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec 06 09:44:18 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e123 e123: 3 total, 3 up, 3 in
Dec 06 09:44:18 compute-1 ceph-mon[79770]: osdmap e122: 3 total, 3 up, 3 in
Dec 06 09:44:18 compute-1 ceph-mon[79770]: pgmap v60: 337 pgs: 1 active+remapped, 336 active+clean; 457 KiB data, 148 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s; 0 B/s, 1 objects/s recovering
Dec 06 09:44:18 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "23"}]: dispatch
Dec 06 09:44:18 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 123 pg[10.14( v 51'1027 (0'0,51'1027] local-lis/les=122/123 n=5 ec=58/45 lis/c=120/74 les/c/f=121/75/0 sis=122) [0] r=0 lpr=122 pi=[74,122)/1 crt=51'1027 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:44:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:18 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd3240016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:44:19 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:19 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd328002720 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:44:19 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:44:19 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:44:19 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:19.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:44:19 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:19 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd3180016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:44:20 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:44:20 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:44:20 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:20.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:44:20 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:20 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd3240016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:44:21 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:21 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd3240016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:44:21 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "23"}]': finished
Dec 06 09:44:21 compute-1 ceph-mon[79770]: osdmap e123: 3 total, 3 up, 3 in
Dec 06 09:44:21 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e124 e124: 3 total, 3 up, 3 in
Dec 06 09:44:21 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:44:21 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:44:21 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:21.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:44:21 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:21 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd328003a70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:44:22 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:44:22 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:44:22 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:22.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:44:22 compute-1 ceph-mon[79770]: pgmap v62: 337 pgs: 1 active+remapped, 336 active+clean; 457 KiB data, 148 MiB used, 60 GiB / 60 GiB avail; 234 B/s rd, 0 op/s; 0 B/s, 1 objects/s recovering
Dec 06 09:44:22 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "24"}]: dispatch
Dec 06 09:44:22 compute-1 ceph-mon[79770]: pgmap v63: 337 pgs: 1 active+remapped, 336 active+clean; 457 KiB data, 148 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s; 0 B/s, 0 objects/s recovering
Dec 06 09:44:22 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "24"}]: dispatch
Dec 06 09:44:22 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "24"}]': finished
Dec 06 09:44:22 compute-1 ceph-mon[79770]: osdmap e124: 3 total, 3 up, 3 in
Dec 06 09:44:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:22 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd3180016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:44:22 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e125 e125: 3 total, 3 up, 3 in
Dec 06 09:44:22 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e125 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:44:23 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:23 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd3240016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:44:23 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "24"}]': finished
Dec 06 09:44:23 compute-1 ceph-mon[79770]: osdmap e125: 3 total, 3 up, 3 in
Dec 06 09:44:23 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "25"}]: dispatch
Dec 06 09:44:23 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:44:23 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:44:23 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:23.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:44:23 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e126 e126: 3 total, 3 up, 3 in
Dec 06 09:44:23 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:23 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd3240016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:44:24 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:44:24 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:44:24 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:24.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:44:24 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:24 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd328003a70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:44:24 compute-1 ceph-mon[79770]: pgmap v66: 337 pgs: 337 active+clean; 457 KiB data, 148 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Dec 06 09:44:24 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "25"}]': finished
Dec 06 09:44:24 compute-1 sudo[90049]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 09:44:24 compute-1 ceph-mon[79770]: osdmap e126: 3 total, 3 up, 3 in
Dec 06 09:44:24 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 09:44:24 compute-1 sudo[90049]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:44:24 compute-1 sudo[90049]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:25 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:25 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd318002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:44:25 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:44:25 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:44:25 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:25.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:44:25 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "26"}]: dispatch
Dec 06 09:44:25 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e127 e127: 3 total, 3 up, 3 in
Dec 06 09:44:25 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 127 pg[10.19( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=89/89 les/c/f=90/90/0 sis=127) [0] r=0 lpr=127 pi=[89,127)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:44:25 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:25 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd3240016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:44:26 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:44:26 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:44:26 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:26.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:44:26 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[88952]: 06/12/2025 09:44:26 : epoch 6933fad7 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd3240016a0 fd 38 proxy ignored for local
Dec 06 09:44:26 compute-1 kernel: ganesha.nfsd[89936]: segfault at 50 ip 00007fd3f967032e sp 00007fd3b17f9210 error 4 in libntirpc.so.5.8[7fd3f9655000+2c000] likely on CPU 1 (core 0, socket 1)
Dec 06 09:44:26 compute-1 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec 06 09:44:26 compute-1 systemd[1]: Started Process Core Dump (PID 90075/UID 0).
Dec 06 09:44:26 compute-1 ceph-mon[79770]: pgmap v68: 337 pgs: 337 active+clean; 457 KiB data, 148 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Dec 06 09:44:26 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "26"}]': finished
Dec 06 09:44:26 compute-1 ceph-mon[79770]: osdmap e127: 3 total, 3 up, 3 in
Dec 06 09:44:26 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e128 e128: 3 total, 3 up, 3 in
Dec 06 09:44:27 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 128 pg[10.19( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=89/89 les/c/f=90/90/0 sis=128) [0]/[1] r=-1 lpr=128 pi=[89,128)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:44:27 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 128 pg[10.19( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=89/89 les/c/f=90/90/0 sis=128) [0]/[1] r=-1 lpr=128 pi=[89,128)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 09:44:27 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:44:27 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:44:27 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:27.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:44:27 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:44:28 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e129 e129: 3 total, 3 up, 3 in
Dec 06 09:44:28 compute-1 ceph-mon[79770]: osdmap e128: 3 total, 3 up, 3 in
Dec 06 09:44:28 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:44:28 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec 06 09:44:28 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:28.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec 06 09:44:28 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e130 e130: 3 total, 3 up, 3 in
Dec 06 09:44:28 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 130 pg[10.19( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=7 ec=58/45 lis/c=128/89 les/c/f=129/90/0 sis=130) [0] r=0 lpr=130 pi=[89,130)/1 luod=0'0 crt=51'1027 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:44:28 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 130 pg[10.19( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=7 ec=58/45 lis/c=128/89 les/c/f=129/90/0 sis=130) [0] r=0 lpr=130 pi=[89,130)/1 crt=51'1027 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:44:29 compute-1 ceph-mon[79770]: pgmap v71: 337 pgs: 1 remapped+peering, 336 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 207 B/s rd, 0 op/s
Dec 06 09:44:29 compute-1 ceph-mon[79770]: osdmap e129: 3 total, 3 up, 3 in
Dec 06 09:44:29 compute-1 ceph-mon[79770]: osdmap e130: 3 total, 3 up, 3 in
Dec 06 09:44:29 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:44:29 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:44:29 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:29.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:44:29 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e131 e131: 3 total, 3 up, 3 in
Dec 06 09:44:29 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 131 pg[10.19( v 51'1027 (0'0,51'1027] local-lis/les=130/131 n=7 ec=58/45 lis/c=128/89 les/c/f=129/90/0 sis=130) [0] r=0 lpr=130 pi=[89,130)/1 crt=51'1027 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:44:30 compute-1 systemd-coredump[90076]: Process 88957 (ganesha.nfsd) of user 0 dumped core.
                                                   
                                                   Stack trace of thread 54:
                                                   #0  0x00007fd3f967032e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                   ELF object binary architecture: AMD x86-64
Dec 06 09:44:30 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:44:30 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:44:30 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:30.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:44:30 compute-1 systemd[1]: systemd-coredump@1-90075-0.service: Deactivated successfully.
Dec 06 09:44:30 compute-1 systemd[1]: systemd-coredump@1-90075-0.service: Consumed 3.160s CPU time.
Dec 06 09:44:30 compute-1 podman[90082]: 2025-12-06 09:44:30.235684416 +0000 UTC m=+0.042545541 container died 90da3924cacc3efa403ea45549b6824092497428ac34121bd882ee64a78e789c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid)
Dec 06 09:44:30 compute-1 systemd[1]: var-lib-containers-storage-overlay-f61a3b3a24536c7c4e747ec6bc4b7c9a3e3b3c6a417aa3b9f3cf3ab295997eb4-merged.mount: Deactivated successfully.
Dec 06 09:44:30 compute-1 podman[90082]: 2025-12-06 09:44:30.396969354 +0000 UTC m=+0.203830449 container remove 90da3924cacc3efa403ea45549b6824092497428ac34121bd882ee64a78e789c (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, CEPH_REF=squid, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, ceph=True, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 06 09:44:30 compute-1 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Main process exited, code=exited, status=139/n/a
Dec 06 09:44:30 compute-1 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Failed with result 'exit-code'.
Dec 06 09:44:30 compute-1 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Consumed 2.668s CPU time.
Dec 06 09:44:31 compute-1 ceph-mon[79770]: pgmap v74: 337 pgs: 1 remapped+peering, 336 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:44:31 compute-1 ceph-mon[79770]: osdmap e131: 3 total, 3 up, 3 in
Dec 06 09:44:31 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:44:31 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:44:31 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:31.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:44:32 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:44:32 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:44:32 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:32.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:44:32 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e131 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:44:33 compute-1 ceph-mon[79770]: pgmap v76: 337 pgs: 1 remapped+peering, 336 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 213 B/s rd, 0 op/s
Dec 06 09:44:33 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:44:33 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:44:33 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:33.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:44:34 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:44:34 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:44:34 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:34.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:44:34 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e132 e132: 3 total, 3 up, 3 in
Dec 06 09:44:34 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "27"}]: dispatch
Dec 06 09:44:35 compute-1 ceph-mon[79770]: pgmap v77: 337 pgs: 337 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s; 18 B/s, 1 objects/s recovering
Dec 06 09:44:35 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "27"}]': finished
Dec 06 09:44:35 compute-1 ceph-mon[79770]: osdmap e132: 3 total, 3 up, 3 in
Dec 06 09:44:35 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/094435 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec 06 09:44:35 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:44:35 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:44:35 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:35.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:44:36 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:44:36 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:44:36 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:36.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:44:36 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "28"}]: dispatch
Dec 06 09:44:36 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e133 e133: 3 total, 3 up, 3 in
Dec 06 09:44:36 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 133 pg[10.1b( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=97/97 les/c/f=98/98/0 sis=133) [0] r=0 lpr=133 pi=[97,133)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:44:37 compute-1 ceph-mon[79770]: pgmap v79: 337 pgs: 337 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 294 B/s rd, 0 op/s; 15 B/s, 1 objects/s recovering
Dec 06 09:44:37 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "28"}]': finished
Dec 06 09:44:37 compute-1 ceph-mon[79770]: osdmap e133: 3 total, 3 up, 3 in
Dec 06 09:44:37 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e134 e134: 3 total, 3 up, 3 in
Dec 06 09:44:37 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 134 pg[10.1b( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=97/97 les/c/f=98/98/0 sis=134) [0]/[1] r=-1 lpr=134 pi=[97,134)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:44:37 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 134 pg[10.1b( empty local-lis/les=0/0 n=0 ec=58/45 lis/c=97/97 les/c/f=98/98/0 sis=134) [0]/[1] r=-1 lpr=134 pi=[97,134)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Dec 06 09:44:37 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:44:37 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:44:37 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:37.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:44:37 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:44:38 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:44:38 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:44:38 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:38.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:44:38 compute-1 ceph-mon[79770]: osdmap e134: 3 total, 3 up, 3 in
Dec 06 09:44:38 compute-1 ceph-mon[79770]: pgmap v82: 337 pgs: 1 remapped+peering, 336 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s; 18 B/s, 1 objects/s recovering
Dec 06 09:44:38 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e135 e135: 3 total, 3 up, 3 in
Dec 06 09:44:39 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e136 e136: 3 total, 3 up, 3 in
Dec 06 09:44:39 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 136 pg[10.1b( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=2 ec=58/45 lis/c=134/97 les/c/f=135/98/0 sis=136) [0] r=0 lpr=136 pi=[97,136)/1 luod=0'0 crt=51'1027 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Dec 06 09:44:39 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 136 pg[10.1b( v 51'1027 (0'0,51'1027] local-lis/les=0/0 n=2 ec=58/45 lis/c=134/97 les/c/f=135/98/0 sis=136) [0] r=0 lpr=136 pi=[97,136)/1 crt=51'1027 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Dec 06 09:44:39 compute-1 ceph-mon[79770]: osdmap e135: 3 total, 3 up, 3 in
Dec 06 09:44:39 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 09:44:39 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:44:39 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:44:39 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:39.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:44:40 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:44:40 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:44:40 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:40.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:44:40 compute-1 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Scheduled restart job, restart counter is at 2.
Dec 06 09:44:40 compute-1 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.djsnbu for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec 06 09:44:40 compute-1 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Consumed 2.668s CPU time.
Dec 06 09:44:40 compute-1 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.djsnbu for 5ecd3f74-dade-5fc4-92ce-8950ae424258...
Dec 06 09:44:40 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e137 e137: 3 total, 3 up, 3 in
Dec 06 09:44:40 compute-1 ceph-osd[77465]: osd.0 pg_epoch: 137 pg[10.1b( v 51'1027 (0'0,51'1027] local-lis/les=136/137 n=2 ec=58/45 lis/c=134/97 les/c/f=135/98/0 sis=136) [0] r=0 lpr=136 pi=[97,136)/1 crt=51'1027 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Dec 06 09:44:40 compute-1 ceph-mon[79770]: osdmap e136: 3 total, 3 up, 3 in
Dec 06 09:44:40 compute-1 ceph-mon[79770]: pgmap v85: 337 pgs: 1 remapped+peering, 336 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail
Dec 06 09:44:41 compute-1 podman[90180]: 2025-12-06 09:44:41.013563207 +0000 UTC m=+0.107260239 container create 490bcdc1ddf2a147605f7bef7763287ae9d25da8b09ab41fcfcd1cec65c24755 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 06 09:44:41 compute-1 podman[90180]: 2025-12-06 09:44:40.931822695 +0000 UTC m=+0.025519777 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 06 09:44:41 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0bbbd98a88994f54839b8379f302a87baf27efd11c17b9c4f84aad6e60a7f0d8/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec 06 09:44:41 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0bbbd98a88994f54839b8379f302a87baf27efd11c17b9c4f84aad6e60a7f0d8/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 06 09:44:41 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0bbbd98a88994f54839b8379f302a87baf27efd11c17b9c4f84aad6e60a7f0d8/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec 06 09:44:41 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0bbbd98a88994f54839b8379f302a87baf27efd11c17b9c4f84aad6e60a7f0d8/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.djsnbu-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec 06 09:44:41 compute-1 podman[90180]: 2025-12-06 09:44:41.094232492 +0000 UTC m=+0.187929544 container init 490bcdc1ddf2a147605f7bef7763287ae9d25da8b09ab41fcfcd1cec65c24755 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 09:44:41 compute-1 podman[90180]: 2025-12-06 09:44:41.100138716 +0000 UTC m=+0.193835748 container start 490bcdc1ddf2a147605f7bef7763287ae9d25da8b09ab41fcfcd1cec65c24755 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1)
Dec 06 09:44:41 compute-1 bash[90180]: 490bcdc1ddf2a147605f7bef7763287ae9d25da8b09ab41fcfcd1cec65c24755
Dec 06 09:44:41 compute-1 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.djsnbu for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec 06 09:44:41 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:41 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec 06 09:44:41 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:41 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec 06 09:44:41 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:41 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec 06 09:44:41 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:41 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec 06 09:44:41 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:41 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec 06 09:44:41 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:41 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec 06 09:44:41 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:41 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec 06 09:44:41 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:41 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 09:44:41 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:44:41 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:44:41 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:41.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:44:42 compute-1 ceph-mon[79770]: osdmap e137: 3 total, 3 up, 3 in
Dec 06 09:44:42 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:44:42 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:44:42 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:42.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:44:42 compute-1 sudo[89516]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:42 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:44:43 compute-1 ceph-mon[79770]: pgmap v87: 337 pgs: 1 remapped+peering, 336 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail
Dec 06 09:44:43 compute-1 sudo[90388]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wdobhnpqqtntrfdqnvivkrhhyejftogk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014283.0188894-370-259929612551808/AnsiballZ_command.py'
Dec 06 09:44:43 compute-1 sudo[90388]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:43 compute-1 python3.9[90390]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:44:43 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:44:43 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:44:43 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:43.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:44:44 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:44:44 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:44:44 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:44.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:44:44 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "29"}]: dispatch
Dec 06 09:44:44 compute-1 sudo[90388]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:44 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e138 e138: 3 total, 3 up, 3 in
Dec 06 09:44:44 compute-1 sudo[90551]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 09:44:44 compute-1 sudo[90551]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:44:44 compute-1 sudo[90551]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:45 compute-1 ceph-mon[79770]: pgmap v88: 337 pgs: 337 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 852 B/s rd, 170 B/s wr, 1 op/s; 18 B/s, 0 objects/s recovering
Dec 06 09:44:45 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "29"}]': finished
Dec 06 09:44:45 compute-1 ceph-mon[79770]: osdmap e138: 3 total, 3 up, 3 in
Dec 06 09:44:45 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:44:45 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec 06 09:44:45 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:45.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec 06 09:44:46 compute-1 sudo[90701]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sqldgqtgorimlzgmknnkwscbsfvihdbl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014285.4023528-395-268993395166545/AnsiballZ_selinux.py'
Dec 06 09:44:46 compute-1 sudo[90701]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:46 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:44:46 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:44:46 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:46.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:44:46 compute-1 python3.9[90703]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Dec 06 09:44:46 compute-1 sudo[90701]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:46 compute-1 ceph-mon[79770]: pgmap v90: 337 pgs: 337 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 829 B/s rd, 165 B/s wr, 1 op/s; 17 B/s, 0 objects/s recovering
Dec 06 09:44:46 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]: dispatch
Dec 06 09:44:46 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e139 e139: 3 total, 3 up, 3 in
Dec 06 09:44:47 compute-1 sudo[90854]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qgcbpnrybrzpnhaypgrtafruuvuahudy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014286.8703115-427-105720878107126/AnsiballZ_command.py'
Dec 06 09:44:47 compute-1 sudo[90854]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:47 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 09:44:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:47 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 09:44:47 compute-1 python3.9[90856]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Dec 06 09:44:47 compute-1 sudo[90854]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:47 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]': finished
Dec 06 09:44:47 compute-1 ceph-mon[79770]: osdmap e139: 3 total, 3 up, 3 in
Dec 06 09:44:47 compute-1 sudo[91006]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-whzkaxwsdlzzqzffcjfghpdhjikyvcxd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014287.6105683-451-111292438292839/AnsiballZ_file.py'
Dec 06 09:44:47 compute-1 sudo[91006]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:47 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:44:47 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:44:47 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:44:47 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:47.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:44:48 compute-1 python3.9[91008]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:44:48 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:44:48 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:44:48 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:48.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:44:48 compute-1 sudo[91006]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:48 compute-1 ceph-mon[79770]: pgmap v92: 337 pgs: 337 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 4.3 KiB/s rd, 1.6 KiB/s wr, 5 op/s; 15 B/s, 0 objects/s recovering
Dec 06 09:44:48 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "31"}]: dispatch
Dec 06 09:44:48 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e140 e140: 3 total, 3 up, 3 in
Dec 06 09:44:48 compute-1 sudo[91159]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-azfwzxakdifhclmjktbnjnfnactxrmlk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014288.3312435-475-81074487567573/AnsiballZ_mount.py'
Dec 06 09:44:48 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e141 e141: 3 total, 3 up, 3 in
Dec 06 09:44:48 compute-1 sudo[91159]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:49 compute-1 python3.9[91161]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Dec 06 09:44:49 compute-1 sudo[91159]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:49 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "31"}]': finished
Dec 06 09:44:49 compute-1 ceph-mon[79770]: osdmap e140: 3 total, 3 up, 3 in
Dec 06 09:44:49 compute-1 ceph-mon[79770]: osdmap e141: 3 total, 3 up, 3 in
Dec 06 09:44:49 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e142 e142: 3 total, 3 up, 3 in
Dec 06 09:44:49 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:44:49 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:44:49 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:49.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:44:50 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:44:50 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:44:50 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:50.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:44:50 compute-1 sudo[91312]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-azlwhknjwwkjdfwrxtriyddsdyqtvwxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014290.0178149-559-228424179317805/AnsiballZ_file.py'
Dec 06 09:44:50 compute-1 sudo[91312]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:50 compute-1 ceph-mon[79770]: pgmap v95: 337 pgs: 337 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 4.6 KiB/s rd, 1.9 KiB/s wr, 6 op/s
Dec 06 09:44:50 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]: dispatch
Dec 06 09:44:50 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]': finished
Dec 06 09:44:50 compute-1 ceph-mon[79770]: osdmap e142: 3 total, 3 up, 3 in
Dec 06 09:44:50 compute-1 python3.9[91314]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:44:50 compute-1 sudo[91312]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:50 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e143 e143: 3 total, 3 up, 3 in
Dec 06 09:44:51 compute-1 sudo[91464]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fybizybehwwdxqdxgoiydwahbkbmkyot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014290.7966628-583-228080158207838/AnsiballZ_stat.py'
Dec 06 09:44:51 compute-1 sudo[91464]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:51 compute-1 python3.9[91466]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:44:51 compute-1 sudo[91464]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:51 compute-1 sudo[91542]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sfkbsgmlzrsybhyjcvxsixlmeaspiyez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014290.7966628-583-228080158207838/AnsiballZ_file.py'
Dec 06 09:44:51 compute-1 sudo[91542]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:51 compute-1 python3.9[91544]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem _original_basename=tls-ca-bundle.pem recurse=False state=file path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:44:51 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:44:51 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:44:51 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:51.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:44:51 compute-1 sudo[91542]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:52 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e144 e144: 3 total, 3 up, 3 in
Dec 06 09:44:52 compute-1 ceph-mon[79770]: osdmap e143: 3 total, 3 up, 3 in
Dec 06 09:44:52 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:44:52 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:44:52 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:52.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:44:52 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:44:53 compute-1 sudo[91695]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-indfpkdlywmdbjppapgaevpphbgnkjkl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014292.7148964-646-67587092476814/AnsiballZ_stat.py'
Dec 06 09:44:53 compute-1 sudo[91695]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:53 compute-1 ceph-mon[79770]: pgmap v98: 337 pgs: 337 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail
Dec 06 09:44:53 compute-1 ceph-mon[79770]: osdmap e144: 3 total, 3 up, 3 in
Dec 06 09:44:53 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e145 e145: 3 total, 3 up, 3 in
Dec 06 09:44:53 compute-1 python3.9[91697]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:44:53 compute-1 sudo[91695]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:53 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:53 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec 06 09:44:53 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:53 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec 06 09:44:53 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:53 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec 06 09:44:53 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:53 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec 06 09:44:53 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:53 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec 06 09:44:53 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:53 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec 06 09:44:53 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:53 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec 06 09:44:53 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:53 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 06 09:44:53 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:53 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 06 09:44:53 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:53 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 06 09:44:53 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:53 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec 06 09:44:53 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:53 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 06 09:44:53 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:53 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec 06 09:44:53 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:53 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec 06 09:44:53 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:53 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec 06 09:44:53 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:53 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec 06 09:44:53 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:53 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec 06 09:44:53 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:53 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec 06 09:44:53 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:53 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec 06 09:44:53 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:53 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec 06 09:44:53 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:53 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec 06 09:44:53 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:53 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec 06 09:44:53 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:53 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec 06 09:44:53 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:53 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec 06 09:44:53 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:53 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec 06 09:44:53 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:53 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec 06 09:44:53 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:53 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec 06 09:44:53 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:44:53 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:44:53 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:53.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:44:53 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:53 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b0000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:44:54 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 e146: 3 total, 3 up, 3 in
Dec 06 09:44:54 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:44:54 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:44:54 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:54.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:44:54 compute-1 ceph-mon[79770]: osdmap e145: 3 total, 3 up, 3 in
Dec 06 09:44:54 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 09:44:54 compute-1 sudo[91866]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ccqubdvksevpftqrrmlrmepowvamefbk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014293.9537287-685-17291125860647/AnsiballZ_getent.py'
Dec 06 09:44:54 compute-1 sudo[91866]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:54 compute-1 python3.9[91868]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Dec 06 09:44:54 compute-1 sudo[91866]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:54 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:54 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:44:55 compute-1 ceph-mon[79770]: pgmap v101: 337 pgs: 1 activating+remapped, 336 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 255 B/s wr, 1 op/s; 5/224 objects misplaced (2.232%); 0 B/s, 1 objects/s recovering
Dec 06 09:44:55 compute-1 ceph-mon[79770]: osdmap e146: 3 total, 3 up, 3 in
Dec 06 09:44:55 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:55 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa188000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:44:55 compute-1 sudo[92019]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-luqnmoicsedllhsjtumczoaedfiiusvk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014295.1172934-715-85810110210147/AnsiballZ_getent.py'
Dec 06 09:44:55 compute-1 sudo[92019]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:55 compute-1 python3.9[92021]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Dec 06 09:44:55 compute-1 sudo[92019]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:55 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:44:55 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:44:55 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:55.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:44:55 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:55 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa180000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:44:56 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:44:56 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:44:56 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:56.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:44:56 compute-1 ceph-mon[79770]: mgrmap e35: compute-0.qhdjwa(active, since 92s), standbys: compute-1.sauzid, compute-2.oazbvn
Dec 06 09:44:56 compute-1 sudo[92173]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lfqwkppnlknwikqasilefolranccinwe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014295.915885-739-110300553343170/AnsiballZ_group.py'
Dec 06 09:44:56 compute-1 sudo[92173]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:56 compute-1 python3.9[92175]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 06 09:44:56 compute-1 sudo[92173]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:56 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:56 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa18c000fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:44:57 compute-1 ceph-mon[79770]: pgmap v103: 337 pgs: 1 activating+remapped, 336 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 413 B/s rd, 206 B/s wr, 1 op/s; 5/224 objects misplaced (2.232%); 0 B/s, 1 objects/s recovering
Dec 06 09:44:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/094457 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec 06 09:44:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:57 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198001680 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:44:57 compute-1 sudo[92325]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fzvioocoxgocmqhrhknkxtulmecrepln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014296.9762013-766-19399101583846/AnsiballZ_file.py'
Dec 06 09:44:57 compute-1 sudo[92325]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:57 compute-1 python3.9[92327]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Dec 06 09:44:57 compute-1 sudo[92325]: pam_unix(sudo:session): session closed for user root
Dec 06 09:44:57 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:44:57 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:44:57 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:44:57 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:57.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:44:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:57 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1880016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:44:58 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:44:58 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:44:58 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:44:58.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:44:58 compute-1 ceph-mon[79770]: pgmap v104: 337 pgs: 337 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 682 B/s wr, 1 op/s; 18 B/s, 1 objects/s recovering
Dec 06 09:44:58 compute-1 sudo[92478]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hqasmirwvbcjhcdfzlteiokfvojxuzwl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014298.2028131-799-135680649157170/AnsiballZ_dnf.py'
Dec 06 09:44:58 compute-1 sudo[92478]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:44:58 compute-1 python3.9[92480]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 06 09:44:58 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:58 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1800016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:44:59 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:59 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa18c001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:44:59 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:44:59 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:44:59 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:44:59.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:44:59 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:44:59 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa18c001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:00 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:45:00 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:45:00 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:00.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:45:00 compute-1 sudo[92478]: pam_unix(sudo:session): session closed for user root
Dec 06 09:45:00 compute-1 sudo[92632]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bdmqhegpmgvzxzppnaglrwpywijdnvnx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014300.5513642-823-6580280650335/AnsiballZ_file.py'
Dec 06 09:45:00 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:00 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1880016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:00 compute-1 sudo[92632]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:45:01 compute-1 python3.9[92634]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:45:01 compute-1 sudo[92632]: pam_unix(sudo:session): session closed for user root
Dec 06 09:45:01 compute-1 ceph-mon[79770]: pgmap v105: 337 pgs: 337 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 397 B/s rd, 529 B/s wr, 1 op/s; 14 B/s, 1 objects/s recovering
Dec 06 09:45:01 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:01 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1800016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:01 compute-1 sudo[92784]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ptfxspjcquiryqlmpkaezzzvnfwfzndk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014301.3141398-847-280964911018330/AnsiballZ_stat.py'
Dec 06 09:45:01 compute-1 sudo[92784]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:45:01 compute-1 python3.9[92786]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:45:01 compute-1 sudo[92784]: pam_unix(sudo:session): session closed for user root
Dec 06 09:45:01 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:45:01 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:45:01 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:01.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:45:01 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:01 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198001fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:02 compute-1 sudo[92862]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yvahaiazfmrepoysdjxflpthdmbdlefe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014301.3141398-847-280964911018330/AnsiballZ_file.py'
Dec 06 09:45:02 compute-1 sudo[92862]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:45:02 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:45:02 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000027s ======
Dec 06 09:45:02 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:02.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Dec 06 09:45:02 compute-1 python3.9[92864]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/modules-load.d/99-edpm.conf _original_basename=edpm-modprobe.conf.j2 recurse=False state=file path=/etc/modules-load.d/99-edpm.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:45:02 compute-1 sudo[92862]: pam_unix(sudo:session): session closed for user root
Dec 06 09:45:02 compute-1 ceph-mon[79770]: pgmap v106: 337 pgs: 337 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 117 B/s rd, 353 B/s wr, 0 op/s; 12 B/s, 0 objects/s recovering
Dec 06 09:45:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:02 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa18c001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:02 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:45:02 compute-1 sudo[93015]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xjtccqpafznqpqfowmoanenwvftiepfw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014302.6618762-886-135836154265672/AnsiballZ_stat.py'
Dec 06 09:45:02 compute-1 sudo[93015]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:45:03 compute-1 python3.9[93017]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:45:03 compute-1 sudo[93015]: pam_unix(sudo:session): session closed for user root
Dec 06 09:45:03 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:03 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1880016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:03 compute-1 sudo[93093]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kueyhcgobaejgcjjtnnfymuuakaqfcjc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014302.6618762-886-135836154265672/AnsiballZ_file.py'
Dec 06 09:45:03 compute-1 sudo[93093]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:45:03 compute-1 python3.9[93095]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/sysctl.d/99-edpm.conf _original_basename=edpm-sysctl.conf.j2 recurse=False state=file path=/etc/sysctl.d/99-edpm.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:45:03 compute-1 sudo[93093]: pam_unix(sudo:session): session closed for user root
Dec 06 09:45:03 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:45:03 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:45:03 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:03.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:45:03 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:03 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1800016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:04 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:45:04 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:45:04 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:04.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:45:04 compute-1 sudo[93246]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qjzgdhcckbkcjhyzzhqxdfxkyceopoib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014304.2653904-931-233770971051416/AnsiballZ_dnf.py'
Dec 06 09:45:04 compute-1 sudo[93246]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:45:04 compute-1 python3.9[93248]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 06 09:45:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:04 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198001fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:04 compute-1 ceph-mon[79770]: pgmap v107: 337 pgs: 337 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 307 B/s rd, 307 B/s wr, 0 op/s; 10 B/s, 0 objects/s recovering
Dec 06 09:45:05 compute-1 sudo[93250]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 09:45:05 compute-1 sudo[93250]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:45:05 compute-1 sudo[93250]: pam_unix(sudo:session): session closed for user root
Dec 06 09:45:05 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:05 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa18c001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:05 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:45:05 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:45:05 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:05.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:45:05 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:05 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa188002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:06 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:45:06 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:45:06 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:06.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:45:06 compute-1 sudo[93246]: pam_unix(sudo:session): session closed for user root
Dec 06 09:45:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:06 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa180002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:07 compute-1 ceph-mon[79770]: pgmap v108: 337 pgs: 337 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 262 B/s rd, 262 B/s wr, 0 op/s; 9 B/s, 0 objects/s recovering
Dec 06 09:45:07 compute-1 python3.9[93425]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:45:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:07 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198001fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:07 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:45:07 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:45:07 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:45:07 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:07.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:45:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:07 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa18c003340 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:08 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:45:08 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:45:08 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:08.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:45:08 compute-1 python3.9[93577]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Dec 06 09:45:08 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:08 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa188002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:08 compute-1 python3.9[93728]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:45:09 compute-1 ceph-mon[79770]: pgmap v109: 337 pgs: 337 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 255 B/s wr, 0 op/s; 9 B/s, 0 objects/s recovering
Dec 06 09:45:09 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 09:45:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:09 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198001fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:09 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:45:09 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:45:09 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:09.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:45:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:09 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198001fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:10 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:45:10 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:45:10 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:10.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:45:10 compute-1 sudo[93878]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ujoazzrexqxbgmzrkhysqulqrdcwwwab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014309.5259006-1054-250812418759149/AnsiballZ_systemd.py'
Dec 06 09:45:10 compute-1 sudo[93878]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:45:10 compute-1 python3.9[93881]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:45:10 compute-1 systemd[1]: Stopping Dynamic System Tuning Daemon...
Dec 06 09:45:10 compute-1 ceph-mon[79770]: pgmap v110: 337 pgs: 337 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:45:10 compute-1 systemd[1]: tuned.service: Deactivated successfully.
Dec 06 09:45:10 compute-1 systemd[1]: Stopped Dynamic System Tuning Daemon.
Dec 06 09:45:10 compute-1 systemd[1]: Starting Dynamic System Tuning Daemon...
Dec 06 09:45:10 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:10 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa18c003340 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:11 compute-1 systemd[1]: Started Dynamic System Tuning Daemon.
Dec 06 09:45:11 compute-1 sudo[93878]: pam_unix(sudo:session): session closed for user root
Dec 06 09:45:11 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:11 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa18c003340 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:11 compute-1 python3.9[94043]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Dec 06 09:45:11 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:45:11 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:45:11 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:11.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:45:11 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:11 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b0000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:12 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:45:12 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:45:12 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:12.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:45:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:12 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198001fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:12 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:45:13 compute-1 ceph-mon[79770]: pgmap v111: 337 pgs: 337 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:45:13 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:13 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198001fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:13 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:45:13 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:45:13 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:13.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:45:13 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:13 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa180003430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:14 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:45:14 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:45:14 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:14.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:45:14 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:14 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa180003430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:15 compute-1 sudo[94195]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wyfeapjljjfrkxsuivindaskodsprsxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014314.7117236-1225-189926425846133/AnsiballZ_systemd.py'
Dec 06 09:45:15 compute-1 sudo[94195]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:45:15 compute-1 ceph-mon[79770]: pgmap v112: 337 pgs: 337 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Dec 06 09:45:15 compute-1 python3.9[94197]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:45:15 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:15 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa18c004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:15 compute-1 sudo[94195]: pam_unix(sudo:session): session closed for user root
Dec 06 09:45:15 compute-1 sudo[94299]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:45:15 compute-1 sudo[94299]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:45:15 compute-1 sudo[94299]: pam_unix(sudo:session): session closed for user root
Dec 06 09:45:15 compute-1 sudo[94348]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 06 09:45:15 compute-1 sudo[94348]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:45:15 compute-1 sudo[94399]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jnsbxnfowdxhjflnqgbbbsxgwuxlsjru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014315.5521152-1225-38483324900051/AnsiballZ_systemd.py'
Dec 06 09:45:15 compute-1 sudo[94399]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:45:15 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:45:15 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:45:15 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:15.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:45:15 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:15 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b00021f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:16 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:45:16 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:45:16 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:16.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:45:16 compute-1 python3.9[94401]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:45:16 compute-1 sudo[94399]: pam_unix(sudo:session): session closed for user root
Dec 06 09:45:16 compute-1 sudo[94348]: pam_unix(sudo:session): session closed for user root
Dec 06 09:45:16 compute-1 sshd-session[87391]: Connection closed by 192.168.122.30 port 38118
Dec 06 09:45:16 compute-1 sshd-session[87387]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:45:16 compute-1 systemd[1]: session-38.scope: Deactivated successfully.
Dec 06 09:45:16 compute-1 systemd[1]: session-38.scope: Consumed 1min 11.601s CPU time.
Dec 06 09:45:16 compute-1 systemd-logind[788]: Session 38 logged out. Waiting for processes to exit.
Dec 06 09:45:16 compute-1 systemd-logind[788]: Removed session 38.
Dec 06 09:45:16 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:16 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa180003430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:17 compute-1 ceph-mon[79770]: pgmap v113: 337 pgs: 337 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:45:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:17 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198003ad0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:17 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:45:17 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:45:17 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:45:17 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:17.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:45:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:17 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa18c004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/094518 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec 06 09:45:18 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:45:18 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 09:45:18 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:18.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 09:45:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:18 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b0002390 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:19 compute-1 ceph-mon[79770]: pgmap v114: 337 pgs: 337 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Dec 06 09:45:19 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:45:19 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:45:19 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:19 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa180003430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:19 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:45:19 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:45:19 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:19.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:45:19 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:19 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198003ad0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:20 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:45:20 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:45:20 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:20.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:45:20 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 09:45:20 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 06 09:45:20 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:45:20 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:45:20 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 06 09:45:20 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 06 09:45:20 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 09:45:20 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:20 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa18c004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:21 compute-1 ceph-mon[79770]: pgmap v115: 337 pgs: 337 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:45:21 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:21 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa18c004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:21 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:45:21 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:45:21 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:21.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:45:21 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:21 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa180003430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:22 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:45:22 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:45:22 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:22.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:45:22 compute-1 sshd-session[94463]: Accepted publickey for zuul from 192.168.122.30 port 35380 ssh2: ECDSA SHA256:r1j7aLsKAM+XxDNbzEU5vWGpGNCOaIBwc7FZdATPttA
Dec 06 09:45:22 compute-1 systemd-logind[788]: New session 39 of user zuul.
Dec 06 09:45:22 compute-1 systemd[1]: Started Session 39 of User zuul.
Dec 06 09:45:22 compute-1 sshd-session[94463]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 06 09:45:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:22 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198003ad0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:22 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:45:23 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:23 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b00091b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:23 compute-1 ceph-mon[79770]: pgmap v116: 337 pgs: 337 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:45:23 compute-1 python3.9[94616]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:45:23 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:45:23 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:45:23 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:23.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:45:23 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:23 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa18c004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:24 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:45:24 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:45:24 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:24.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:45:24 compute-1 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #19. Immutable memtables: 0.
Dec 06 09:45:24 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:45:24.472824) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 06 09:45:24 compute-1 ceph-mon[79770]: rocksdb: [db/flush_job.cc:856] [default] [JOB 7] Flushing memtable with next log file: 19
Dec 06 09:45:24 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014324473031, "job": 7, "event": "flush_started", "num_memtables": 1, "num_entries": 2961, "num_deletes": 252, "total_data_size": 10711815, "memory_usage": 11054624, "flush_reason": "Manual Compaction"}
Dec 06 09:45:24 compute-1 ceph-mon[79770]: rocksdb: [db/flush_job.cc:885] [default] [JOB 7] Level-0 flush table #20: started
Dec 06 09:45:24 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014324519492, "cf_name": "default", "job": 7, "event": "table_file_creation", "file_number": 20, "file_size": 6722042, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 7705, "largest_seqno": 10661, "table_properties": {"data_size": 6708894, "index_size": 8490, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3589, "raw_key_size": 31688, "raw_average_key_size": 22, "raw_value_size": 6680720, "raw_average_value_size": 4688, "num_data_blocks": 370, "num_entries": 1425, "num_filter_entries": 1425, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014196, "oldest_key_time": 1765014196, "file_creation_time": 1765014324, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 20, "seqno_to_time_mapping": "N/A"}}
Dec 06 09:45:24 compute-1 ceph-mon[79770]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 7] Flush lasted 46733 microseconds, and 21090 cpu microseconds.
Dec 06 09:45:24 compute-1 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 09:45:24 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:45:24.519609) [db/flush_job.cc:967] [default] [JOB 7] Level-0 flush table #20: 6722042 bytes OK
Dec 06 09:45:24 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:45:24.519653) [db/memtable_list.cc:519] [default] Level-0 commit table #20 started
Dec 06 09:45:24 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:45:24.521352) [db/memtable_list.cc:722] [default] Level-0 commit table #20: memtable #1 done
Dec 06 09:45:24 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:45:24.521382) EVENT_LOG_v1 {"time_micros": 1765014324521374, "job": 7, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 06 09:45:24 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:45:24.521403) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 06 09:45:24 compute-1 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 7] Try to delete WAL files size 10697508, prev total WAL file size 10715538, number of live WAL files 2.
Dec 06 09:45:24 compute-1 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000016.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 09:45:24 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:45:24.524707) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F7300323531' seq:72057594037927935, type:22 .. '7061786F7300353033' seq:0, type:0; will stop at (end)
Dec 06 09:45:24 compute-1 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 8] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 06 09:45:24 compute-1 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 7 Base level 0, inputs: [20(6564KB)], [18(12MB)]
Dec 06 09:45:24 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014324525081, "job": 8, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [20], "files_L6": [18], "score": -1, "input_data_size": 19465518, "oldest_snapshot_seqno": -1}
Dec 06 09:45:24 compute-1 ceph-mon[79770]: pgmap v117: 337 pgs: 337 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Dec 06 09:45:24 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 09:45:24 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:45:24 compute-1 sudo[94696]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:45:24 compute-1 sudo[94696]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:45:24 compute-1 sudo[94696]: pam_unix(sudo:session): session closed for user root
Dec 06 09:45:24 compute-1 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 8] Generated table #21: 4125 keys, 14793424 bytes, temperature: kUnknown
Dec 06 09:45:24 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014324739190, "cf_name": "default", "job": 8, "event": "table_file_creation", "file_number": 21, "file_size": 14793424, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14759695, "index_size": 22291, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10373, "raw_key_size": 105206, "raw_average_key_size": 25, "raw_value_size": 14678043, "raw_average_value_size": 3558, "num_data_blocks": 957, "num_entries": 4125, "num_filter_entries": 4125, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014012, "oldest_key_time": 0, "file_creation_time": 1765014324, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 21, "seqno_to_time_mapping": "N/A"}}
Dec 06 09:45:24 compute-1 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 09:45:24 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:45:24.739571) [db/compaction/compaction_job.cc:1663] [default] [JOB 8] Compacted 1@0 + 1@6 files to L6 => 14793424 bytes
Dec 06 09:45:24 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:45:24.742194) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 90.9 rd, 69.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(6.4, 12.2 +0.0 blob) out(14.1 +0.0 blob), read-write-amplify(5.1) write-amplify(2.2) OK, records in: 4661, records dropped: 536 output_compression: NoCompression
Dec 06 09:45:24 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:45:24.742224) EVENT_LOG_v1 {"time_micros": 1765014324742210, "job": 8, "event": "compaction_finished", "compaction_time_micros": 214203, "compaction_time_cpu_micros": 73553, "output_level": 6, "num_output_files": 1, "total_output_size": 14793424, "num_input_records": 4661, "num_output_records": 4125, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 06 09:45:24 compute-1 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000020.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 09:45:24 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014324744087, "job": 8, "event": "table_file_deletion", "file_number": 20}
Dec 06 09:45:24 compute-1 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000018.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 09:45:24 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014324747192, "job": 8, "event": "table_file_deletion", "file_number": 18}
Dec 06 09:45:24 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:45:24.524316) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 09:45:24 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:45:24.747301) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 09:45:24 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:45:24.747309) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 09:45:24 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:45:24.747311) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 09:45:24 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:45:24.747313) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 09:45:24 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:45:24.747315) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 09:45:24 compute-1 sudo[94796]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rolgyybkcappubdlzuerglyaaoonwweq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014324.4794307-69-137230742736646/AnsiballZ_getent.py'
Dec 06 09:45:24 compute-1 sudo[94796]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:45:24 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:24 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa180003430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:25 compute-1 python3.9[94798]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Dec 06 09:45:25 compute-1 sudo[94796]: pam_unix(sudo:session): session closed for user root
Dec 06 09:45:25 compute-1 sudo[94800]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 09:45:25 compute-1 sudo[94800]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:45:25 compute-1 sudo[94800]: pam_unix(sudo:session): session closed for user root
Dec 06 09:45:25 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:25 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198003ad0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:25 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:45:25 compute-1 sudo[94975]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yztmyqvmvdmqxqfmipsdtexkmimcriqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014325.5030956-105-96602351146645/AnsiballZ_setup.py'
Dec 06 09:45:25 compute-1 sudo[94975]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:45:25 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:45:25 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:45:25 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:25.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:45:25 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:25 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b00091b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:26 compute-1 python3.9[94977]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 06 09:45:26 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:45:26 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:45:26 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:26.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:45:26 compute-1 sudo[94975]: pam_unix(sudo:session): session closed for user root
Dec 06 09:45:26 compute-1 ceph-mon[79770]: pgmap v118: 337 pgs: 337 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 06 09:45:26 compute-1 sudo[95060]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fmncmrjscdoouctptkdrnhumjpkikjdf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014325.5030956-105-96602351146645/AnsiballZ_dnf.py'
Dec 06 09:45:26 compute-1 sudo[95060]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:45:26 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:26 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b00091b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:26 compute-1 python3.9[95062]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 06 09:45:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:27 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa180003430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:27 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 09:45:27 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:45:27 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:45:27 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:45:27 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:27.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:45:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:27 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198003ad0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:28 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:45:28 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:45:28 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:28.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:45:28 compute-1 sudo[95060]: pam_unix(sudo:session): session closed for user root
Dec 06 09:45:28 compute-1 ceph-mon[79770]: pgmap v119: 337 pgs: 337 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 597 B/s wr, 1 op/s
Dec 06 09:45:28 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:28 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1a0001670 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:29 compute-1 sudo[95214]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-inxlezsouygqvzfnpyyjngqsazfboomb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014328.8051546-147-25172811663757/AnsiballZ_dnf.py'
Dec 06 09:45:29 compute-1 sudo[95214]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:45:29 compute-1 python3.9[95216]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 06 09:45:29 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:29 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b00091b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:29 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:45:29 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:45:29 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:29.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:45:29 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:29 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa180003430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:30 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:45:30 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:45:30 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:30.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:45:30 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:30 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 09:45:30 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:30 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 09:45:30 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:30 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 09:45:30 compute-1 sudo[95214]: pam_unix(sudo:session): session closed for user root
Dec 06 09:45:30 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:30 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198003ad0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:30 compute-1 ceph-mon[79770]: pgmap v120: 337 pgs: 337 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Dec 06 09:45:31 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:31 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1a0001670 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:31 compute-1 sudo[95368]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yatwzvgalymgikhgdgkuyrdcmsuabcti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014331.0061631-171-1926102900408/AnsiballZ_systemd.py'
Dec 06 09:45:31 compute-1 sudo[95368]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:45:31 compute-1 python3.9[95370]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 06 09:45:31 compute-1 sudo[95368]: pam_unix(sudo:session): session closed for user root
Dec 06 09:45:31 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:45:31 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:45:31 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:31.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:45:31 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:31 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b000a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:32 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:45:32 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:45:32 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:32.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:45:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:32 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa180003430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:32 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:45:32 compute-1 python3.9[95524]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:45:32 compute-1 ceph-mon[79770]: pgmap v121: 337 pgs: 337 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Dec 06 09:45:33 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:33 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1a0002380 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:33 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:33 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec 06 09:45:33 compute-1 sudo[95676]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pzwfwfcrbttuefadtizkbxfffvqhvqet ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014333.1447797-225-214640073074087/AnsiballZ_sefcontext.py'
Dec 06 09:45:33 compute-1 sudo[95676]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:45:33 compute-1 python3.9[95678]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Dec 06 09:45:33 compute-1 sshd-session[95585]: Received disconnect from 193.46.255.99 port 46086:11:  [preauth]
Dec 06 09:45:33 compute-1 sshd-session[95585]: Disconnected from authenticating user root 193.46.255.99 port 46086 [preauth]
Dec 06 09:45:33 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:45:33 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:45:33 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:33.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:45:33 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:33 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198003ad0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:34 compute-1 sudo[95676]: pam_unix(sudo:session): session closed for user root
Dec 06 09:45:34 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:45:34 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:45:34 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:34.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:45:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:34 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b000a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:34 compute-1 python3.9[95829]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:45:34 compute-1 ceph-mon[79770]: pgmap v122: 337 pgs: 337 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.0 KiB/s rd, 852 B/s wr, 2 op/s
Dec 06 09:45:35 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:35 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa180003430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:35 compute-1 sudo[95985]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bofkkudqxsbfspqhhqpcpspyheszvqjj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014335.375989-279-114525607702066/AnsiballZ_dnf.py'
Dec 06 09:45:35 compute-1 sudo[95985]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:45:35 compute-1 python3.9[95987]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 06 09:45:35 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:45:35 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:45:35 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:35.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:45:35 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:35 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1a0002380 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:36 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:45:36 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:45:36 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:36.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:45:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/094536 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec 06 09:45:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:36 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198003ad0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:37 compute-1 ceph-mon[79770]: pgmap v123: 337 pgs: 337 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.9 KiB/s rd, 852 B/s wr, 2 op/s
Dec 06 09:45:37 compute-1 sudo[95985]: pam_unix(sudo:session): session closed for user root
Dec 06 09:45:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:37 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b000a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:37 compute-1 sudo[96139]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bojdxaohqsrrhogutdvlfdkhwnapttdf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014337.4668145-303-261660100671208/AnsiballZ_command.py'
Dec 06 09:45:37 compute-1 sudo[96139]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:45:37 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:45:38 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:37 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa180003430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:38 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:45:38 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:45:38 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:37.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:45:38 compute-1 python3.9[96141]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:45:38 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:45:38 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:45:38 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:38.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:45:38 compute-1 sudo[96139]: pam_unix(sudo:session): session closed for user root
Dec 06 09:45:38 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:38 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1a0002380 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:39 compute-1 ceph-mon[79770]: pgmap v124: 337 pgs: 337 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 938 B/s wr, 3 op/s
Dec 06 09:45:39 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 09:45:39 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:39 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198003ad0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:39 compute-1 sudo[96427]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bvjacilhmrsfacruwdefxxrgdwmlizff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014339.0793176-327-137778892882076/AnsiballZ_file.py'
Dec 06 09:45:39 compute-1 sudo[96427]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:45:39 compute-1 python3.9[96429]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec 06 09:45:39 compute-1 sudo[96427]: pam_unix(sudo:session): session closed for user root
Dec 06 09:45:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:40 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b000a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:40 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:45:40 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 09:45:40 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:40.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 09:45:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/094540 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec 06 09:45:40 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:45:40 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:45:40 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:40.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:45:40 compute-1 python3.9[96580]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:45:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:40 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa180003430 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:41 compute-1 sudo[96732]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-grdyqwcmiodegpudozizhqwdhgvldhnn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014340.7722774-375-70392126959674/AnsiballZ_dnf.py'
Dec 06 09:45:41 compute-1 sudo[96732]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:45:41 compute-1 python3.9[96734]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 06 09:45:41 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:41 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1a0002380 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:41 compute-1 ceph-mon[79770]: pgmap v125: 337 pgs: 337 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 341 B/s wr, 1 op/s
Dec 06 09:45:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:42 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198003ad0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:42 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:45:42 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:45:42 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:42.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:45:42 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:45:42 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:45:42 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:42.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:45:42 compute-1 sudo[96732]: pam_unix(sudo:session): session closed for user root
Dec 06 09:45:42 compute-1 ceph-mon[79770]: pgmap v126: 337 pgs: 337 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 341 B/s wr, 1 op/s
Dec 06 09:45:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:42 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b000a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:42 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:45:43 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:43 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa188000f30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:43 compute-1 sudo[96887]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iwdzubkieqhmzepaizovdtnpmciyexyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014343.1329603-402-195304874255478/AnsiballZ_dnf.py'
Dec 06 09:45:43 compute-1 sudo[96887]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:45:43 compute-1 python3.9[96889]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 06 09:45:44 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:44 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1a0003870 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:44 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:45:44 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:45:44 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:44.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:45:44 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:45:44 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:45:44 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:44.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:45:44 compute-1 ceph-mon[79770]: pgmap v127: 337 pgs: 337 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 341 B/s wr, 1 op/s
Dec 06 09:45:44 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:44 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198003ad0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:45 compute-1 sudo[96887]: pam_unix(sudo:session): session closed for user root
Dec 06 09:45:45 compute-1 sudo[96916]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 09:45:45 compute-1 sudo[96916]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:45:45 compute-1 sudo[96916]: pam_unix(sudo:session): session closed for user root
Dec 06 09:45:45 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:45 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b000a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:45 compute-1 sudo[97066]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jfmjktstfbwswhqwktrhtephgboixymv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014345.6192632-438-215989281623364/AnsiballZ_stat.py'
Dec 06 09:45:45 compute-1 sudo[97066]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:45:46 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:46 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa188001dd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:46 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:45:46 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:45:46 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:46.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:45:46 compute-1 python3.9[97068]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:45:46 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:46 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 09:45:46 compute-1 sudo[97066]: pam_unix(sudo:session): session closed for user root
Dec 06 09:45:46 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:45:46 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:45:46 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:46.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:45:46 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:46 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1a0003870 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:47 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198003ad0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:47 compute-1 sudo[97221]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-szxovememykphrjzvhwwlmkknajnwxgu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014346.609189-462-15843558679773/AnsiballZ_slurp.py'
Dec 06 09:45:47 compute-1 sudo[97221]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:45:47 compute-1 ceph-mon[79770]: pgmap v128: 337 pgs: 337 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Dec 06 09:45:47 compute-1 python3.9[97223]: ansible-ansible.builtin.slurp Invoked with path=/var/lib/edpm-config/os-net-config.returncode src=/var/lib/edpm-config/os-net-config.returncode
Dec 06 09:45:47 compute-1 sudo[97221]: pam_unix(sudo:session): session closed for user root
Dec 06 09:45:47 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:45:48 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:48 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1a0003870 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:48 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:45:48 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:45:48 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:48.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:45:48 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:45:48 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:45:48 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:48.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:45:48 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:48 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa188001dd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:48 compute-1 ceph-mon[79770]: pgmap v129: 337 pgs: 337 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 682 B/s wr, 2 op/s
Dec 06 09:45:49 compute-1 sshd-session[94466]: Connection closed by 192.168.122.30 port 35380
Dec 06 09:45:49 compute-1 sshd-session[94463]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:45:49 compute-1 systemd[1]: session-39.scope: Deactivated successfully.
Dec 06 09:45:49 compute-1 systemd[1]: session-39.scope: Consumed 18.849s CPU time.
Dec 06 09:45:49 compute-1 systemd-logind[788]: Session 39 logged out. Waiting for processes to exit.
Dec 06 09:45:49 compute-1 systemd-logind[788]: Removed session 39.
Dec 06 09:45:49 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:49 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 09:45:49 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:49 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 09:45:49 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:49 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b000a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:49 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:49 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 09:45:50 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:50 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198003ad0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:50 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:45:50 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 09:45:50 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:50.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 09:45:50 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:45:50 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:45:50 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:50.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:45:50 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:50 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1a0004190 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:51 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:51 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa188002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:52 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b000a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:52 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:45:52 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:45:52 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:52.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:45:52 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:45:52 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:45:52 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:52.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:45:52 compute-1 ceph-mon[79770]: pgmap v130: 337 pgs: 337 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 597 B/s wr, 1 op/s
Dec 06 09:45:52 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:45:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:52 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b000a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:53 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:53 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec 06 09:45:53 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:53 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b000a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:54 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:45:54 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:45:54 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:54.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:45:54 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:54 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b000a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:54 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:45:54 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:45:54 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:54.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:45:54 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:54 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b000a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:55 compute-1 ceph-mon[79770]: pgmap v131: 337 pgs: 337 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 597 B/s wr, 1 op/s
Dec 06 09:45:55 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:55 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b000a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:56 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:45:56 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:45:56 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:56.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:45:56 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:56 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b000a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:56 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:45:56 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:45:56 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:56.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:45:56 compute-1 sshd-session[71307]: Received disconnect from 38.102.83.98 port 58744:11: disconnected by user
Dec 06 09:45:56 compute-1 sshd-session[71307]: Disconnected from user zuul 38.102.83.98 port 58744
Dec 06 09:45:56 compute-1 sshd-session[71304]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:45:56 compute-1 systemd[1]: session-19.scope: Deactivated successfully.
Dec 06 09:45:56 compute-1 systemd[1]: session-19.scope: Consumed 9.994s CPU time.
Dec 06 09:45:56 compute-1 systemd-logind[788]: Session 19 logged out. Waiting for processes to exit.
Dec 06 09:45:56 compute-1 systemd-logind[788]: Removed session 19.
Dec 06 09:45:56 compute-1 ceph-mon[79770]: pgmap v132: 337 pgs: 337 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 938 B/s wr, 3 op/s
Dec 06 09:45:56 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 09:45:56 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:56 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198003ad0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:57 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa188002fb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:57 compute-1 sshd-session[97255]: Accepted publickey for zuul from 192.168.122.30 port 35742 ssh2: ECDSA SHA256:r1j7aLsKAM+XxDNbzEU5vWGpGNCOaIBwc7FZdATPttA
Dec 06 09:45:57 compute-1 systemd-logind[788]: New session 40 of user zuul.
Dec 06 09:45:57 compute-1 systemd[1]: Started Session 40 of User zuul.
Dec 06 09:45:57 compute-1 sshd-session[97255]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 06 09:45:57 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:45:58 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:58 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa18c0014d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:58 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:45:58 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:45:58 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:45:58.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:45:58 compute-1 ceph-mon[79770]: pgmap v133: 337 pgs: 337 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 938 B/s wr, 3 op/s
Dec 06 09:45:58 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:45:58 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:45:58 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:45:58.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:45:58 compute-1 python3.9[97409]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:45:58 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/094558 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec 06 09:45:58 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:58 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b000a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:59 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:45:59 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198003ad0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:45:59 compute-1 python3.9[97563]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 06 09:46:00 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:00 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa188002fb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:46:00 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:46:00 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:46:00 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:00.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:46:00 compute-1 ceph-mon[79770]: pgmap v134: 337 pgs: 337 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.4 KiB/s rd, 1023 B/s wr, 3 op/s
Dec 06 09:46:00 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:46:00 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:46:00 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:00.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:46:00 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:00 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa18c0014d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:46:01 compute-1 python3.9[97757]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:46:01 compute-1 ceph-mon[79770]: pgmap v135: 337 pgs: 337 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 426 B/s wr, 1 op/s
Dec 06 09:46:01 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:01 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b000a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:46:01 compute-1 sshd-session[97258]: Connection closed by 192.168.122.30 port 35742
Dec 06 09:46:01 compute-1 sshd-session[97255]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:46:01 compute-1 systemd[1]: session-40.scope: Deactivated successfully.
Dec 06 09:46:01 compute-1 systemd[1]: session-40.scope: Consumed 2.527s CPU time.
Dec 06 09:46:01 compute-1 systemd-logind[788]: Session 40 logged out. Waiting for processes to exit.
Dec 06 09:46:01 compute-1 systemd-logind[788]: Removed session 40.
Dec 06 09:46:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:02 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198003ad0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:46:02 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:46:02 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:46:02 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:02.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:46:02 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:46:02 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:46:02 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:02.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:46:02 compute-1 ceph-mon[79770]: pgmap v136: 337 pgs: 337 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 426 B/s wr, 1 op/s
Dec 06 09:46:02 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:46:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:02 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa188002fb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:46:03 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:03 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa18c0014d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:46:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:04 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b000a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:46:04 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:46:04 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:46:04 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:04.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:46:04 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:46:04 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:46:04 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:04.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:46:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:04 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b000a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:46:05 compute-1 ceph-mon[79770]: pgmap v137: 337 pgs: 337 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 426 B/s wr, 1 op/s
Dec 06 09:46:05 compute-1 sudo[97785]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 09:46:05 compute-1 sudo[97785]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:46:05 compute-1 sudo[97785]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:05 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:05 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1880040b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:46:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:06 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa18c0014d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:46:06 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:46:06 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:46:06 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:06.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:46:06 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:46:06 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:46:06 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:06.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:46:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:06 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa18c0014d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:46:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:07 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1880040b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:46:07 compute-1 ceph-mon[79770]: pgmap v138: 337 pgs: 337 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Dec 06 09:46:07 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:46:08 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:08 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b000a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:46:08 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:46:08 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:46:08 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:08.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:46:08 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:46:08 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:46:08 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:08.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:46:08 compute-1 sshd-session[97812]: Accepted publickey for zuul from 192.168.122.30 port 55098 ssh2: ECDSA SHA256:r1j7aLsKAM+XxDNbzEU5vWGpGNCOaIBwc7FZdATPttA
Dec 06 09:46:08 compute-1 systemd-logind[788]: New session 41 of user zuul.
Dec 06 09:46:08 compute-1 systemd[1]: Started Session 41 of User zuul.
Dec 06 09:46:08 compute-1 sshd-session[97812]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 06 09:46:08 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:08 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b000a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:46:09 compute-1 ceph-mon[79770]: pgmap v139: 337 pgs: 337 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 85 B/s wr, 0 op/s
Dec 06 09:46:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:09 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b000a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:46:10 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:10 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa17c000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:46:10 compute-1 python3.9[97967]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:46:10 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:46:10 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:46:10 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:10.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:46:10 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 09:46:10 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:46:10 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:46:10 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:10.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:46:10 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:10 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa174000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:46:11 compute-1 python3.9[98122]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:46:11 compute-1 ceph-mon[79770]: pgmap v140: 337 pgs: 337 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:46:11 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:11 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198003ce0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:46:11 compute-1 sudo[98276]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nkneeufpegmvsqvzezctuqswbbyclwpe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014371.5533304-81-251462579180771/AnsiballZ_setup.py'
Dec 06 09:46:11 compute-1 sudo[98276]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:46:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:12 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b000a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:46:12 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:46:12 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:46:12 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:12.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:46:12 compute-1 python3.9[98278]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 06 09:46:12 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:46:12 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:46:12 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:12.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:46:12 compute-1 sudo[98276]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:12 compute-1 sudo[98361]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-btfvnvapscelljptpgvoshbcerrbfxuo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014371.5533304-81-251462579180771/AnsiballZ_dnf.py'
Dec 06 09:46:12 compute-1 sudo[98361]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:46:12 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:46:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:12 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa17c0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:46:13 compute-1 python3.9[98363]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 06 09:46:13 compute-1 ceph-mon[79770]: pgmap v141: 337 pgs: 337 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:46:13 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:13 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1740016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:46:14 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:14 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198003ce0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:46:14 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:46:14 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:46:14 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:14.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:46:14 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:46:14 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:46:14 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:14.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:46:14 compute-1 sudo[98361]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:14 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:14 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b000a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:46:14 compute-1 sudo[98515]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wlojbcichykycmrgkucroesabmxeylms ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014374.6992276-117-185459735435961/AnsiballZ_setup.py'
Dec 06 09:46:15 compute-1 sudo[98515]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:46:15 compute-1 ceph-mon[79770]: pgmap v142: 337 pgs: 337 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:46:15 compute-1 python3.9[98517]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 06 09:46:15 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:15 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa17c0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:46:15 compute-1 sudo[98515]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:16 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:16 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1740016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:46:16 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:46:16 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:46:16 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:16.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:46:16 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:46:16 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:46:16 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:16.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:46:16 compute-1 sudo[98713]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jxcmavkejtpicboelqdutitkmyapaphl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014376.014304-150-94279067268000/AnsiballZ_file.py'
Dec 06 09:46:16 compute-1 sudo[98713]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:46:16 compute-1 python3.9[98715]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:46:16 compute-1 sudo[98713]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:16 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:16 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198003ce0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:46:17 compute-1 ceph-mon[79770]: pgmap v143: 337 pgs: 337 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:46:17 compute-1 sshd-session[98661]: Received disconnect from 193.46.255.217 port 53290:11:  [preauth]
Dec 06 09:46:17 compute-1 sshd-session[98661]: Disconnected from authenticating user root 193.46.255.217 port 53290 [preauth]
Dec 06 09:46:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:17 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b000a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:46:17 compute-1 sudo[98865]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rlbkqxjiztartnnurpiehhmxupgmfvhe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014377.0572767-174-86662148310549/AnsiballZ_command.py'
Dec 06 09:46:17 compute-1 sudo[98865]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:46:17 compute-1 python3.9[98867]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:46:17 compute-1 sudo[98865]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:17 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:46:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:18 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa17c0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:46:18 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:46:18 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:46:18 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:18.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:46:18 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:46:18 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:46:18 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:18.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:46:18 compute-1 sudo[99030]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zmajwsocqexjoypqbioqnvocnuyujhsm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014378.054939-198-110954358190382/AnsiballZ_stat.py'
Dec 06 09:46:18 compute-1 sudo[99030]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:46:18 compute-1 python3.9[99032]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:46:18 compute-1 sudo[99030]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:18 compute-1 sudo[99108]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jozlwxwecdwdicegyhqbavsppzvdbdrl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014378.054939-198-110954358190382/AnsiballZ_file.py'
Dec 06 09:46:18 compute-1 sudo[99108]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:46:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:18 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1740016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:46:19 compute-1 python3.9[99110]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:46:19 compute-1 sudo[99108]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:19 compute-1 ceph-mon[79770]: pgmap v144: 337 pgs: 337 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 09:46:19 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:19 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198003ce0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:46:19 compute-1 sudo[99260]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dpuqsnsjtxyxsugsjcowjwrkteqgwabq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014379.376069-234-86291807200143/AnsiballZ_stat.py'
Dec 06 09:46:19 compute-1 sudo[99260]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:46:19 compute-1 python3.9[99262]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:46:19 compute-1 sudo[99260]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:20 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:20 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b000a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:46:20 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:46:20 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:46:20 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:20.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:46:20 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:46:20 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:46:20 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:20.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:46:20 compute-1 sudo[99339]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xssilqmdatezsuslchlpqqnhlnmymqtw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014379.376069-234-86291807200143/AnsiballZ_file.py'
Dec 06 09:46:20 compute-1 sudo[99339]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:46:20 compute-1 python3.9[99341]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:46:20 compute-1 sudo[99339]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:20 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:20 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa17c002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:46:21 compute-1 ceph-mon[79770]: pgmap v145: 337 pgs: 337 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:46:21 compute-1 sudo[99491]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rrfwelazmcmvlkaulnjiybixyyazzjyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014380.8773534-273-6192592912558/AnsiballZ_ini_file.py'
Dec 06 09:46:21 compute-1 sudo[99491]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:46:21 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:21 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa174002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:46:21 compute-1 python3.9[99493]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:46:21 compute-1 sudo[99491]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:21 compute-1 sudo[99643]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-erbztsvxrqhfnovgmsxaexwhkgaufroi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014381.7021677-273-34637499488424/AnsiballZ_ini_file.py'
Dec 06 09:46:21 compute-1 sudo[99643]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:46:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:22 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198003ce0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:46:22 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:46:22 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:46:22 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:22.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:46:22 compute-1 python3.9[99645]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:46:22 compute-1 sudo[99643]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:22 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:46:22 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:46:22 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:22.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:46:22 compute-1 sudo[99796]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fqlnwctenpwvprylwowyeziyfngsirai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014382.3173256-273-5904409244169/AnsiballZ_ini_file.py'
Dec 06 09:46:22 compute-1 sudo[99796]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:46:22 compute-1 python3.9[99798]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:46:22 compute-1 sudo[99796]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:22 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:46:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:22 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b000a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:46:23 compute-1 sudo[99948]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vfrydfccqnfgooiwgfpatomatkpehaep ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014382.967036-273-144986236933371/AnsiballZ_ini_file.py'
Dec 06 09:46:23 compute-1 sudo[99948]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:46:23 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:23 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa17c002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:46:23 compute-1 python3.9[99950]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:46:23 compute-1 sudo[99948]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:23 compute-1 ceph-mon[79770]: pgmap v146: 337 pgs: 337 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:46:24 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:24 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa174002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:46:24 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:46:24 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:46:24 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:24.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:46:24 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:46:24 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 09:46:24 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:24.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 09:46:24 compute-1 sudo[100101]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-azfxlqdebdwsobuhhknhbomkoojtcjlp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014383.9663632-366-258978231736544/AnsiballZ_dnf.py'
Dec 06 09:46:24 compute-1 sudo[100101]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:46:24 compute-1 ceph-mon[79770]: pgmap v147: 337 pgs: 337 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:46:24 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 09:46:24 compute-1 python3.9[100103]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 06 09:46:24 compute-1 sudo[100105]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:46:24 compute-1 sudo[100105]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:46:24 compute-1 sudo[100105]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:24 compute-1 sudo[100130]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 06 09:46:24 compute-1 sudo[100130]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:46:24 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:24 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198003ce0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:46:25 compute-1 sudo[100130]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:25 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:25 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b000a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:46:25 compute-1 sudo[100186]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 09:46:25 compute-1 sudo[100186]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:46:25 compute-1 sudo[100186]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:26 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:26 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa17c002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:46:26 compute-1 sudo[100101]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:26 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:46:26 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:46:26 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:26.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:46:26 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:46:26 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:46:26 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:26.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:46:26 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 09:46:26 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 06 09:46:26 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:26 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa174002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:46:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:27 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198003ce0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:46:27 compute-1 sudo[100361]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zksokmvsgimyhpcwrmuwivdodhizgmwt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014387.1737003-399-158015444506552/AnsiballZ_setup.py'
Dec 06 09:46:27 compute-1 sudo[100361]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:46:27 compute-1 ceph-mon[79770]: pgmap v148: 337 pgs: 337 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:46:27 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:46:27 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:46:27 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 06 09:46:27 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 06 09:46:27 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 09:46:27 compute-1 python3.9[100363]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:46:27 compute-1 sudo[100361]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:27 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:46:28 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:28 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b000a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:46:28 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:46:28 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:46:28 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:28.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:46:28 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:46:28 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:46:28 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:28.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:46:28 compute-1 sudo[100516]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vudumzpdifgvllqtwzfjgoicjjxpklwf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014388.0599334-423-231785678112127/AnsiballZ_stat.py'
Dec 06 09:46:28 compute-1 sudo[100516]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:46:28 compute-1 python3.9[100518]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:46:28 compute-1 sudo[100516]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:28 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:28 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa17c003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:46:29 compute-1 sudo[100668]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hzbhouiffkfdlyewuykwcyrvrpsdixdi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014389.000161-450-199921524949186/AnsiballZ_stat.py'
Dec 06 09:46:29 compute-1 sudo[100668]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:46:29 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:29 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa174003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:46:29 compute-1 python3.9[100670]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:46:29 compute-1 sudo[100668]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:30 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:30 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198003ce0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:46:30 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:46:30 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:46:30 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:30.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:46:30 compute-1 sudo[100820]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xbyljkbfrhbfpiloqvrxghmdhcrjamun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014389.8743558-480-204327774874475/AnsiballZ_command.py'
Dec 06 09:46:30 compute-1 sudo[100820]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:46:30 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:46:30 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:46:30 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:30.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:46:30 compute-1 python3.9[100822]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:46:30 compute-1 sudo[100820]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:30 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:30 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b000a2d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:46:31 compute-1 sudo[100974]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxzfjkjvumjhhanjfhkoihtkjulawvuf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014390.8439438-510-184074475787337/AnsiballZ_service_facts.py'
Dec 06 09:46:31 compute-1 sudo[100974]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:46:31 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:31 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa17c003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:46:31 compute-1 python3.9[100976]: ansible-service_facts Invoked
Dec 06 09:46:31 compute-1 network[100993]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 06 09:46:31 compute-1 network[100994]: 'network-scripts' will be removed from distribution in near future.
Dec 06 09:46:31 compute-1 network[100995]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 06 09:46:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:32 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa174003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:46:32 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:46:32 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:46:32 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:32.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:46:32 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:46:32 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:46:32 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:32.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:46:32 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:46:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:32 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198003ce0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:46:33 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:33 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b000a2f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:46:33 compute-1 ceph-mon[79770]: pgmap v149: 337 pgs: 337 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 09:46:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:34 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa17c003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:46:34 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:46:34 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:46:34 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:34.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:46:34 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:46:34 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:46:34 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:34.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:46:34 compute-1 ceph-mon[79770]: pgmap v150: 337 pgs: 337 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:46:34 compute-1 ceph-mon[79770]: pgmap v151: 337 pgs: 337 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:46:34 compute-1 ceph-mon[79770]: pgmap v152: 337 pgs: 337 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:46:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:34 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa174003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:46:35 compute-1 sudo[100974]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:35 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:35 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198003ce0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:46:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:36 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b000a310 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:46:36 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:46:36 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:46:36 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:36.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:46:36 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:46:36 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 09:46:36 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:36.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 09:46:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:36 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b000a310 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:46:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:37 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa174003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:46:37 compute-1 ceph-mon[79770]: pgmap v153: 337 pgs: 337 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:46:37 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:46:38 compute-1 sudo[101281]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sprreuopliuqdkroqvrnqllcnedsmzau ; /bin/bash /home/zuul/.ansible/tmp/ansible-tmp-1765014397.5795653-555-88089312270167/AnsiballZ_timesync_provider.sh /home/zuul/.ansible/tmp/ansible-tmp-1765014397.5795653-555-88089312270167/args'
Dec 06 09:46:38 compute-1 sudo[101281]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:46:38 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:38 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198003d00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:46:38 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:46:38 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:46:38 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:38.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:46:38 compute-1 sudo[101281]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:38 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:46:38 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:46:38 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:38.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:46:38 compute-1 sudo[101449]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-reiimyibodroycihaicnpksghvvqoxmc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014398.6038692-588-196725538482918/AnsiballZ_dnf.py'
Dec 06 09:46:38 compute-1 sudo[101449]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:46:38 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:38 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa17c003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:46:39 compute-1 python3.9[101451]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 06 09:46:39 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:39 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b000a4b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:46:39 compute-1 ceph-mon[79770]: pgmap v154: 337 pgs: 337 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Dec 06 09:46:39 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 09:46:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:40 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa174003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:46:40 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:46:40 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:46:40 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:40.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:46:40 compute-1 sudo[101456]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:46:40 compute-1 sudo[101456]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:46:40 compute-1 sudo[101456]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:40 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:46:40 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:46:40 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:40.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:46:40 compute-1 sudo[101449]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:40 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198003d20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:46:41 compute-1 ceph-mon[79770]: pgmap v155: 337 pgs: 337 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 06 09:46:41 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:46:41 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:46:41 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:41 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1880019e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:46:41 compute-1 sudo[101631]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ltfsccolbtabcgzhtmlkfbneazzxqtyn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014401.180526-627-172494687840511/AnsiballZ_package_facts.py'
Dec 06 09:46:41 compute-1 sudo[101631]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:46:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:42 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b000a4b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:46:42 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:46:42 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:46:42 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:42.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:46:42 compute-1 python3.9[101633]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Dec 06 09:46:42 compute-1 sudo[101631]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:42 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:46:42 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:46:42 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:42.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:46:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/094642 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec 06 09:46:42 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:46:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:42 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa174003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:46:43 compute-1 ceph-mon[79770]: pgmap v156: 337 pgs: 337 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 06 09:46:43 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:43 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198003d20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:46:43 compute-1 sudo[101784]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ynvtzdoagdjuqustbiwlvqhyyvfbsnlg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014403.2724788-657-233365135844091/AnsiballZ_stat.py'
Dec 06 09:46:43 compute-1 sudo[101784]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:46:43 compute-1 python3.9[101786]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:46:43 compute-1 sudo[101784]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:44 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:44 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1880019e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:46:44 compute-1 sudo[101862]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-svjfdftykgrcbewocqeehmihwpyipmly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014403.2724788-657-233365135844091/AnsiballZ_file.py'
Dec 06 09:46:44 compute-1 sudo[101862]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:46:44 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:46:44 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:46:44 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:44.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:46:44 compute-1 python3.9[101864]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/chrony.conf _original_basename=chrony.conf.j2 recurse=False state=file path=/etc/chrony.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:46:44 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:46:44 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:46:44 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:44.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:46:44 compute-1 sudo[101862]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:44 compute-1 sudo[102015]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-itnblyhkcnlwsjedbvvkjfsvoecczkzz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014404.5908487-694-258257534250737/AnsiballZ_stat.py'
Dec 06 09:46:44 compute-1 sudo[102015]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:46:44 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:44 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b000a4b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:46:45 compute-1 python3.9[102017]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:46:45 compute-1 sudo[102015]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:45 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:45 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa174003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:46:45 compute-1 sudo[102093]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhsodsrurqyvzqhaslqyjyygyhxlbzrq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014404.5908487-694-258257534250737/AnsiballZ_file.py'
Dec 06 09:46:45 compute-1 sudo[102093]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:46:45 compute-1 python3.9[102095]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/chronyd _original_basename=chronyd.sysconfig.j2 recurse=False state=file path=/etc/sysconfig/chronyd force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:46:45 compute-1 sudo[102093]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:45 compute-1 ceph-mon[79770]: pgmap v157: 337 pgs: 337 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 06 09:46:45 compute-1 sudo[102120]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 09:46:45 compute-1 sudo[102120]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:46:45 compute-1 sudo[102120]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:46 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:46 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198003d20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:46:46 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:46:46 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:46:46 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:46.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:46:46 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:46:46 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:46:46 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:46.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:46:46 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:46 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1880019e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:46:47 compute-1 ceph-mon[79770]: pgmap v158: 337 pgs: 337 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 06 09:46:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:47 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b000a4b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:46:47 compute-1 sudo[102271]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sqpzpyoqlnfsrfrdmwxodwcebyoldpev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014407.0785553-748-208940025541996/AnsiballZ_lineinfile.py'
Dec 06 09:46:47 compute-1 sudo[102271]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:46:47 compute-1 python3.9[102273]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:46:47 compute-1 sudo[102271]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:47 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:46:48 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:48 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa174003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:46:48 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:46:48 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 09:46:48 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:48.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 09:46:48 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:46:48 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:46:48 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:48.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:46:48 compute-1 systemd[81504]: Created slice User Background Tasks Slice.
Dec 06 09:46:48 compute-1 systemd[81504]: Starting Cleanup of User's Temporary Files and Directories...
Dec 06 09:46:48 compute-1 systemd[81504]: Finished Cleanup of User's Temporary Files and Directories.
Dec 06 09:46:48 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:48 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198003d20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:46:49 compute-1 sudo[102425]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-clapitfawhtussuwryjpohdlehvpvxuy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014408.7622495-792-110912173076370/AnsiballZ_setup.py'
Dec 06 09:46:49 compute-1 sudo[102425]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:46:49 compute-1 ceph-mon[79770]: pgmap v159: 337 pgs: 337 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Dec 06 09:46:49 compute-1 python3.9[102427]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 06 09:46:49 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:49 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1880019e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:46:49 compute-1 sudo[102425]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:50 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:50 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b000a4b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:46:50 compute-1 sudo[102509]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evcraghoicvkipjtnzmuoiuujwchopgz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014408.7622495-792-110912173076370/AnsiballZ_systemd.py'
Dec 06 09:46:50 compute-1 sudo[102509]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:46:50 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:46:50 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:46:50 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:50.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:46:50 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:46:50 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:46:50 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:50.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:46:50 compute-1 python3.9[102511]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:46:50 compute-1 sudo[102509]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:50 compute-1 ceph-mon[79770]: pgmap v160: 337 pgs: 337 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 06 09:46:50 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:50 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa174003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:46:51 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:51 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa198003d20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:46:51 compute-1 sshd-session[97815]: Connection closed by 192.168.122.30 port 55098
Dec 06 09:46:51 compute-1 sshd-session[97812]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:46:51 compute-1 systemd[1]: session-41.scope: Deactivated successfully.
Dec 06 09:46:51 compute-1 systemd[1]: session-41.scope: Consumed 25.288s CPU time.
Dec 06 09:46:51 compute-1 systemd-logind[788]: Session 41 logged out. Waiting for processes to exit.
Dec 06 09:46:51 compute-1 systemd-logind[788]: Removed session 41.
Dec 06 09:46:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:52 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1880019e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:46:52 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:46:52 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:46:52 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:52.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:46:52 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:46:52 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:46:52 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:52.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:46:52 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:46:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:52 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b000a4b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:46:53 compute-1 ceph-mon[79770]: pgmap v161: 337 pgs: 337 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 06 09:46:53 compute-1 kernel: ganesha.nfsd[91724]: segfault at 50 ip 00007fa25de1e32e sp 00007fa22dffa210 error 4 in libntirpc.so.5.8[7fa25de03000+2c000] likely on CPU 3 (core 0, socket 3)
Dec 06 09:46:53 compute-1 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec 06 09:46:53 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[90196]: 06/12/2025 09:46:53 : epoch 6933fb09 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa1b000a4b0 fd 39 proxy ignored for local
Dec 06 09:46:53 compute-1 systemd[1]: Started Process Core Dump (PID 102541/UID 0).
Dec 06 09:46:54 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:46:54 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 09:46:54 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:54.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 09:46:54 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 09:46:54 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:46:54 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:46:54 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:54.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:46:55 compute-1 systemd-coredump[102542]: Process 90200 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 41:
                                                    #0  0x00007fa25de1e32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    ELF object binary architecture: AMD x86-64
Dec 06 09:46:55 compute-1 systemd[1]: systemd-coredump@2-102541-0.service: Deactivated successfully.
Dec 06 09:46:55 compute-1 systemd[1]: systemd-coredump@2-102541-0.service: Consumed 1.819s CPU time.
Dec 06 09:46:55 compute-1 podman[102548]: 2025-12-06 09:46:55.469733439 +0000 UTC m=+0.050723075 container died 490bcdc1ddf2a147605f7bef7763287ae9d25da8b09ab41fcfcd1cec65c24755 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 06 09:46:55 compute-1 systemd[1]: var-lib-containers-storage-overlay-0bbbd98a88994f54839b8379f302a87baf27efd11c17b9c4f84aad6e60a7f0d8-merged.mount: Deactivated successfully.
Dec 06 09:46:55 compute-1 podman[102548]: 2025-12-06 09:46:55.513934815 +0000 UTC m=+0.094924401 container remove 490bcdc1ddf2a147605f7bef7763287ae9d25da8b09ab41fcfcd1cec65c24755 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.build-date=20250325)
Dec 06 09:46:55 compute-1 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Main process exited, code=exited, status=139/n/a
Dec 06 09:46:55 compute-1 ceph-mon[79770]: pgmap v162: 337 pgs: 337 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Dec 06 09:46:55 compute-1 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Failed with result 'exit-code'.
Dec 06 09:46:55 compute-1 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Consumed 2.346s CPU time.
Dec 06 09:46:56 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:46:56 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 09:46:56 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:56.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 09:46:56 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:46:56 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:46:56 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:56.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:46:56 compute-1 ceph-mon[79770]: pgmap v163: 337 pgs: 337 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Dec 06 09:46:57 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:46:58 compute-1 sshd-session[102595]: Accepted publickey for zuul from 192.168.122.30 port 46732 ssh2: ECDSA SHA256:r1j7aLsKAM+XxDNbzEU5vWGpGNCOaIBwc7FZdATPttA
Dec 06 09:46:58 compute-1 systemd-logind[788]: New session 42 of user zuul.
Dec 06 09:46:58 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:46:58 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:46:58 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:46:58.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:46:58 compute-1 systemd[1]: Started Session 42 of User zuul.
Dec 06 09:46:58 compute-1 sshd-session[102595]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 06 09:46:58 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:46:58 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:46:58 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:46:58.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:46:58 compute-1 sudo[102749]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ggzljvyhhzmoqggkjreergwzuqjmzzzn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014418.2591403-27-60566545251263/AnsiballZ_file.py'
Dec 06 09:46:58 compute-1 sudo[102749]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:46:59 compute-1 python3.9[102751]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:46:59 compute-1 sudo[102749]: pam_unix(sudo:session): session closed for user root
Dec 06 09:46:59 compute-1 ceph-mon[79770]: pgmap v164: 337 pgs: 337 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 426 B/s wr, 1 op/s
Dec 06 09:46:59 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/094659 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec 06 09:46:59 compute-1 sudo[102901]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ocbddczddoqwnupqszmsycufwacwpcog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014419.2194657-63-136919981328437/AnsiballZ_stat.py'
Dec 06 09:46:59 compute-1 sudo[102901]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:46:59 compute-1 python3.9[102903]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/ceph-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:46:59 compute-1 sudo[102901]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:00 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:47:00 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:47:00 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:00.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:47:00 compute-1 sudo[102979]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lkjcitlwxdjjioupjmdhwetabxmewuml ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014419.2194657-63-136919981328437/AnsiballZ_file.py'
Dec 06 09:47:00 compute-1 sudo[102979]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:00 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:47:00 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:47:00 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:00.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:47:00 compute-1 python3.9[102982]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/ceph-networks.yaml _original_basename=firewall.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/ceph-networks.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:47:00 compute-1 sudo[102979]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:00 compute-1 sshd-session[102598]: Connection closed by 192.168.122.30 port 46732
Dec 06 09:47:00 compute-1 sshd-session[102595]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:47:00 compute-1 systemd[1]: session-42.scope: Deactivated successfully.
Dec 06 09:47:00 compute-1 systemd[1]: session-42.scope: Consumed 1.729s CPU time.
Dec 06 09:47:00 compute-1 systemd-logind[788]: Session 42 logged out. Waiting for processes to exit.
Dec 06 09:47:00 compute-1 systemd-logind[788]: Removed session 42.
Dec 06 09:47:01 compute-1 ceph-mon[79770]: pgmap v165: 337 pgs: 337 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 426 B/s wr, 1 op/s
Dec 06 09:47:02 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:47:02 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:47:02 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:02.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:47:02 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:47:02 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:47:02 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:02.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:47:02 compute-1 ceph-mon[79770]: pgmap v166: 337 pgs: 337 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 426 B/s wr, 1 op/s
Dec 06 09:47:02 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:47:04 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:47:04 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 09:47:04 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:04.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 09:47:04 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:47:04 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:47:04 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:04.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:47:05 compute-1 ceph-mon[79770]: pgmap v167: 337 pgs: 337 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.6 KiB/s rd, 426 B/s wr, 2 op/s
Dec 06 09:47:05 compute-1 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Scheduled restart job, restart counter is at 3.
Dec 06 09:47:05 compute-1 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.djsnbu for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec 06 09:47:05 compute-1 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Consumed 2.346s CPU time.
Dec 06 09:47:05 compute-1 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.djsnbu for 5ecd3f74-dade-5fc4-92ce-8950ae424258...
Dec 06 09:47:06 compute-1 sudo[103023]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 09:47:06 compute-1 sudo[103023]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:47:06 compute-1 sudo[103023]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:06 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:47:06 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:47:06 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:06.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:47:06 compute-1 podman[103083]: 2025-12-06 09:47:06.195367804 +0000 UTC m=+0.047598459 container create 85455d3243db1463a68bc3199c944543828c9d708094c65b0309507d9efc87ab (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=squid, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 06 09:47:06 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cb68c9c4605a82178b215e3dd6a8db8454de491b3741ae6c6873e5884bb45d11/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec 06 09:47:06 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cb68c9c4605a82178b215e3dd6a8db8454de491b3741ae6c6873e5884bb45d11/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 06 09:47:06 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cb68c9c4605a82178b215e3dd6a8db8454de491b3741ae6c6873e5884bb45d11/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec 06 09:47:06 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cb68c9c4605a82178b215e3dd6a8db8454de491b3741ae6c6873e5884bb45d11/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.djsnbu-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec 06 09:47:06 compute-1 podman[103083]: 2025-12-06 09:47:06.259932786 +0000 UTC m=+0.112163471 container init 85455d3243db1463a68bc3199c944543828c9d708094c65b0309507d9efc87ab (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 06 09:47:06 compute-1 podman[103083]: 2025-12-06 09:47:06.266954177 +0000 UTC m=+0.119184832 container start 85455d3243db1463a68bc3199c944543828c9d708094c65b0309507d9efc87ab (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, OSD_FLAVOR=default, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.license=GPLv2)
Dec 06 09:47:06 compute-1 podman[103083]: 2025-12-06 09:47:06.173328058 +0000 UTC m=+0.025558733 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 06 09:47:06 compute-1 bash[103083]: 85455d3243db1463a68bc3199c944543828c9d708094c65b0309507d9efc87ab
Dec 06 09:47:06 compute-1 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.djsnbu for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec 06 09:47:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:06 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec 06 09:47:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:06 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec 06 09:47:06 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:47:06 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 09:47:06 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:06.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 09:47:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:06 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec 06 09:47:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:06 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec 06 09:47:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:06 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec 06 09:47:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:06 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec 06 09:47:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:06 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec 06 09:47:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:06 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 09:47:07 compute-1 sshd-session[103141]: Accepted publickey for zuul from 192.168.122.30 port 60946 ssh2: ECDSA SHA256:r1j7aLsKAM+XxDNbzEU5vWGpGNCOaIBwc7FZdATPttA
Dec 06 09:47:07 compute-1 systemd-logind[788]: New session 43 of user zuul.
Dec 06 09:47:07 compute-1 systemd[1]: Started Session 43 of User zuul.
Dec 06 09:47:07 compute-1 sshd-session[103141]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 06 09:47:07 compute-1 ceph-mon[79770]: pgmap v168: 337 pgs: 337 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 341 B/s wr, 1 op/s
Dec 06 09:47:07 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:47:08 compute-1 python3.9[103294]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:47:08 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:47:08 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 09:47:08 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:08.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 09:47:08 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:47:08 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 09:47:08 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:08.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 09:47:09 compute-1 sudo[103449]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xyhubzbydafcdwmqxxxwfzawctqqeroc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014428.7479386-60-21962456664570/AnsiballZ_file.py'
Dec 06 09:47:09 compute-1 sudo[103449]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:09 compute-1 python3.9[103451]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:47:09 compute-1 sudo[103449]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:09 compute-1 ceph-mon[79770]: pgmap v169: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.3 KiB/s rd, 682 B/s wr, 3 op/s
Dec 06 09:47:09 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 09:47:10 compute-1 sudo[103624]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tkefxlgswnyzkoopglwfylbdpzkfkbmm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014429.6746457-84-104487260890877/AnsiballZ_stat.py'
Dec 06 09:47:10 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:47:10 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 09:47:10 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:10.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 09:47:10 compute-1 sudo[103624]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:10 compute-1 python3.9[103626]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:47:10 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:47:10 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:47:10 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:10.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:47:10 compute-1 sudo[103624]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:10 compute-1 sudo[103703]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kufbxcdonqiozcskvngjuvxhehsmhkdo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014429.6746457-84-104487260890877/AnsiballZ_file.py'
Dec 06 09:47:10 compute-1 sudo[103703]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:10 compute-1 python3.9[103705]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.9xj64iw3 recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:47:10 compute-1 sudo[103703]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:11 compute-1 ceph-mon[79770]: pgmap v170: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.4 KiB/s rd, 341 B/s wr, 1 op/s
Dec 06 09:47:11 compute-1 sudo[103855]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txeuwfsgddenksjfioqtpqpymcepdqnj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014431.3594167-144-164489022025969/AnsiballZ_stat.py'
Dec 06 09:47:11 compute-1 sudo[103855]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:11 compute-1 python3.9[103857]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:47:11 compute-1 sudo[103855]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:12 compute-1 sudo[103933]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qujkxsirffvczkmhydgudnqnjivcihbt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014431.3594167-144-164489022025969/AnsiballZ_file.py'
Dec 06 09:47:12 compute-1 sudo[103933]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:12 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:47:12 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:47:12 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:12.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:47:12 compute-1 python3.9[103935]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/podman_drop_in _original_basename=.xqifjiqg recurse=False state=file path=/etc/sysconfig/podman_drop_in force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:47:12 compute-1 sudo[103933]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:12 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:47:12 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:47:12 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:12.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:47:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:12 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] rados_kv_traverse :CLIENT ID :EVENT :Failed to lst kv ret=-2
Dec 06 09:47:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:12 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] rados_cluster_read_clids :CLIENT ID :EVENT :Failed to traverse recovery db: -2
Dec 06 09:47:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:12 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 09:47:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:12 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 09:47:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:12 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 09:47:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:12 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 09:47:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:12 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 09:47:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:12 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 09:47:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:12 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 09:47:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:12 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 09:47:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:12 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 09:47:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:12 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 09:47:12 compute-1 sudo[104086]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vaezrtmwbszojhafajiepjsklsdubzwy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014432.5729384-183-30444994834275/AnsiballZ_file.py'
Dec 06 09:47:12 compute-1 sudo[104086]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:12 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:47:13 compute-1 python3.9[104088]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:47:13 compute-1 sudo[104086]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:13 compute-1 ceph-mon[79770]: pgmap v171: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.4 KiB/s rd, 341 B/s wr, 1 op/s
Dec 06 09:47:13 compute-1 sudo[104238]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wiqelmlvdnnxdpvglcpsqwqcbrbgahvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014433.3112085-207-155795655658080/AnsiballZ_stat.py'
Dec 06 09:47:13 compute-1 sudo[104238]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:13 compute-1 python3.9[104240]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:47:13 compute-1 sudo[104238]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:14 compute-1 sudo[104316]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lnxufvohekaocdpdtytiodvrohsxkeqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014433.3112085-207-155795655658080/AnsiballZ_file.py'
Dec 06 09:47:14 compute-1 sudo[104316]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:14 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:47:14 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 09:47:14 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:14.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 09:47:14 compute-1 python3.9[104318]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:47:14 compute-1 sudo[104316]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:14 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:47:14 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:47:14 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:14.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:47:14 compute-1 sudo[104469]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xjarkgnfsgikbsqmujcwpgduzqzdegns ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014434.373219-207-226185203657884/AnsiballZ_stat.py'
Dec 06 09:47:14 compute-1 sudo[104469]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:14 compute-1 python3.9[104471]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:47:14 compute-1 sudo[104469]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:14 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/094714 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 7ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec 06 09:47:15 compute-1 sudo[104547]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ghrbqixhmieoalwqxerpesywpzjduwdq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014434.373219-207-226185203657884/AnsiballZ_file.py'
Dec 06 09:47:15 compute-1 sudo[104547]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:15 compute-1 python3.9[104549]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:47:15 compute-1 sudo[104547]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:16 compute-1 sudo[104699]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sevnsvnbacixyyxechshreawkcscgsjr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014435.7782478-276-82111949587087/AnsiballZ_file.py'
Dec 06 09:47:16 compute-1 sudo[104699]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:16 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:47:16 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:47:16 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:16.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:47:16 compute-1 python3.9[104701]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:47:16 compute-1 ceph-mon[79770]: pgmap v172: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.9 KiB/s rd, 1.1 KiB/s wr, 4 op/s
Dec 06 09:47:16 compute-1 sudo[104699]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:16 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:47:16 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:47:16 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:16.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:47:16 compute-1 sudo[104852]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fbdbwuzqyeskihygjqbhnhojoohtpxbt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014436.5647388-300-97249284354154/AnsiballZ_stat.py'
Dec 06 09:47:16 compute-1 sudo[104852]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:17 compute-1 python3.9[104854]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:47:17 compute-1 sudo[104852]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:17 compute-1 sudo[104930]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uqyzxfnrwducgsrpzemhvnsqnvpxutpu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014436.5647388-300-97249284354154/AnsiballZ_file.py'
Dec 06 09:47:17 compute-1 sudo[104930]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:17 compute-1 ceph-mon[79770]: pgmap v173: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.7 KiB/s rd, 1.1 KiB/s wr, 3 op/s
Dec 06 09:47:17 compute-1 python3.9[104932]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:47:17 compute-1 sudo[104930]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:17 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:47:18 compute-1 sudo[105082]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfeqsdpqhoqniwefghukblmsrasthhcq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014437.8692334-336-68274889383592/AnsiballZ_stat.py'
Dec 06 09:47:18 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:47:18 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.002000049s ======
Dec 06 09:47:18 compute-1 sudo[105082]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:18 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:18.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000049s
Dec 06 09:47:18 compute-1 python3.9[105084]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:47:18 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:47:18 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:47:18 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:18.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:47:18 compute-1 sudo[105082]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:18 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] rados_cluster_end_grace :CLIENT ID :EVENT :Failed to remove rec-0000000000000009:nfs.cephfs.0: -2
Dec 06 09:47:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:18 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec 06 09:47:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:18 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec 06 09:47:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:18 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec 06 09:47:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:18 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec 06 09:47:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:18 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec 06 09:47:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:18 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec 06 09:47:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:18 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec 06 09:47:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:18 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 06 09:47:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:18 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 06 09:47:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:18 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 06 09:47:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:18 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec 06 09:47:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:18 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 06 09:47:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:18 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec 06 09:47:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:18 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec 06 09:47:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:18 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec 06 09:47:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:18 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec 06 09:47:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:18 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec 06 09:47:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:18 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec 06 09:47:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:18 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec 06 09:47:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:18 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec 06 09:47:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:18 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec 06 09:47:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:18 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec 06 09:47:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:18 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec 06 09:47:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:18 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec 06 09:47:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:18 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec 06 09:47:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:18 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec 06 09:47:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:18 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec 06 09:47:18 compute-1 sudo[105174]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-smujtoiquktyhdvrxiabidrbjkvkyrrg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014437.8692334-336-68274889383592/AnsiballZ_file.py'
Dec 06 09:47:18 compute-1 sudo[105174]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:18 compute-1 ceph-mon[79770]: pgmap v174: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 3.5 KiB/s rd, 1.3 KiB/s wr, 5 op/s
Dec 06 09:47:18 compute-1 python3.9[105176]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:47:18 compute-1 sudo[105174]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:19 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:19 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c4c000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:47:19 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:19 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c340016e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:47:19 compute-1 sudo[105329]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ufmnpmysrisqfodplqggfvfqvtgdtttc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014439.2602751-372-124965409470865/AnsiballZ_systemd.py'
Dec 06 09:47:19 compute-1 sudo[105329]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:20 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:20 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c20000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:47:20 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:47:20 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 09:47:20 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:20.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 09:47:20 compute-1 python3.9[105331]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:47:20 compute-1 systemd[1]: Reloading.
Dec 06 09:47:20 compute-1 systemd-rc-local-generator[105361]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:47:20 compute-1 systemd-sysv-generator[105365]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:47:20 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:47:20 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:47:20 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:20.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:47:20 compute-1 sudo[105329]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:21 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:21 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c1c000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:47:21 compute-1 sudo[105520]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yowpedvgqnssjkuinlkqphlukrbmjiig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014440.9120972-396-208718571191358/AnsiballZ_stat.py'
Dec 06 09:47:21 compute-1 sudo[105520]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:21 compute-1 ceph-mon[79770]: pgmap v175: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.3 KiB/s rd, 1023 B/s wr, 3 op/s
Dec 06 09:47:21 compute-1 python3.9[105522]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:47:21 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/094721 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec 06 09:47:21 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:21 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c4c001bd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:47:21 compute-1 sudo[105520]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:21 compute-1 sudo[105598]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kgmimklnsonboluhpkdhfpzonqpeacin ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014440.9120972-396-208718571191358/AnsiballZ_file.py'
Dec 06 09:47:21 compute-1 sudo[105598]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:21 compute-1 python3.9[105600]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:47:21 compute-1 sudo[105598]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:22 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c34002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:47:22 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:47:22 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 09:47:22 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:22.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 09:47:22 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:47:22 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:47:22 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:22.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:47:22 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:47:23 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:23 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c200016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:47:23 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:23 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c1c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:47:23 compute-1 ceph-mon[79770]: pgmap v176: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.3 KiB/s rd, 1023 B/s wr, 3 op/s
Dec 06 09:47:23 compute-1 sudo[105751]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bjywvnaehxffdafclnzhokfiychodbqr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014443.30951-438-76221303021041/AnsiballZ_stat.py'
Dec 06 09:47:23 compute-1 sudo[105751]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:23 compute-1 python3.9[105753]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:47:23 compute-1 sudo[105751]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:24 compute-1 sudo[105829]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bglnhpdpgsggytpasxjhvriwgqocljge ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014443.30951-438-76221303021041/AnsiballZ_file.py'
Dec 06 09:47:24 compute-1 sudo[105829]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:24 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:24 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c4c001bd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:47:24 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:47:24 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:47:24 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:24.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:47:24 compute-1 python3.9[105831]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:47:24 compute-1 sudo[105829]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:24 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:47:24 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:47:24 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:24.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:47:24 compute-1 sudo[105982]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bwqrppknvysohlafjruilcftorsiqyan ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014444.51627-468-191213543014554/AnsiballZ_systemd.py'
Dec 06 09:47:24 compute-1 sudo[105982]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:25 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:25 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c34002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:47:25 compute-1 python3.9[105984]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:47:25 compute-1 systemd[1]: Reloading.
Dec 06 09:47:25 compute-1 systemd-rc-local-generator[106012]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:47:25 compute-1 systemd-sysv-generator[106015]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:47:25 compute-1 systemd[1]: Starting Create netns directory...
Dec 06 09:47:25 compute-1 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 06 09:47:25 compute-1 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 06 09:47:25 compute-1 systemd[1]: Finished Create netns directory.
Dec 06 09:47:25 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:25 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c200016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:47:25 compute-1 sudo[105982]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:25 compute-1 ceph-mon[79770]: pgmap v177: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.4 KiB/s rd, 1023 B/s wr, 3 op/s
Dec 06 09:47:25 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 09:47:26 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:26 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c1c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:47:26 compute-1 sudo[106102]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 09:47:26 compute-1 sudo[106102]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:47:26 compute-1 sudo[106102]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:26 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:47:26 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 09:47:26 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:26.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 09:47:26 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:47:26 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:47:26 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:26.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:47:26 compute-1 python3.9[106201]: ansible-ansible.builtin.service_facts Invoked
Dec 06 09:47:26 compute-1 network[106218]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 06 09:47:26 compute-1 network[106219]: 'network-scripts' will be removed from distribution in near future.
Dec 06 09:47:26 compute-1 network[106220]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 06 09:47:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:27 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c4c0089d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:47:27 compute-1 ceph-mon[79770]: pgmap v178: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 938 B/s rd, 255 B/s wr, 1 op/s
Dec 06 09:47:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:27 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c34002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:47:27 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:47:28 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:28 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c200016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:47:28 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:47:28 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:47:28 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:28.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:47:28 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:47:28 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 09:47:28 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:28.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 09:47:29 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:29 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c1c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:47:29 compute-1 ceph-mon[79770]: pgmap v179: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 938 B/s rd, 255 B/s wr, 1 op/s
Dec 06 09:47:29 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:29 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c4c0089d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:47:30 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:30 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c34002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:47:30 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:47:30 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:47:30 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:30.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:47:30 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:47:30 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:47:30 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:30.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:47:31 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:31 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c20002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:47:31 compute-1 sudo[106482]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hzwigcuhcmasdsfjfftwxoxjrilicnqk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014450.9971025-546-119762302376081/AnsiballZ_stat.py'
Dec 06 09:47:31 compute-1 sudo[106482]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:31 compute-1 ceph-mon[79770]: pgmap v180: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 85 B/s rd, 0 B/s wr, 0 op/s
Dec 06 09:47:31 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:31 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c1c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:47:31 compute-1 python3.9[106484]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:47:31 compute-1 sudo[106482]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:31 compute-1 sudo[106560]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jrbfkdbhugtqjwtmirrzviekmxdjpxqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014450.9971025-546-119762302376081/AnsiballZ_file.py'
Dec 06 09:47:31 compute-1 sudo[106560]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:31 compute-1 python3.9[106562]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/etc/ssh/sshd_config _original_basename=sshd_config_block.j2 recurse=False state=file path=/etc/ssh/sshd_config force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:47:31 compute-1 sudo[106560]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:32 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c4c0096e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:47:32 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:47:32 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:47:32 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:32.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:47:32 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:47:32 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 09:47:32 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:32.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 09:47:32 compute-1 sudo[106713]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ljeohhxsnymdaptrmmbfgptfldayuukz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014452.214201-585-185292106845969/AnsiballZ_file.py'
Dec 06 09:47:32 compute-1 sudo[106713]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:32 compute-1 python3.9[106715]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:47:32 compute-1 sudo[106713]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:32 compute-1 ceph-mon[79770]: pgmap v181: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 85 B/s rd, 0 B/s wr, 0 op/s
Dec 06 09:47:32 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:47:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/094732 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec 06 09:47:33 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:33 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c34002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:47:33 compute-1 sudo[106865]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fgpkociudvjxbfenllunwrfmlgufhisx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014452.9810326-609-158475485559820/AnsiballZ_stat.py'
Dec 06 09:47:33 compute-1 sudo[106865]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:33 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:33 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c20002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:47:33 compute-1 python3.9[106867]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:47:33 compute-1 sudo[106865]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:33 compute-1 sudo[106943]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fapwbwldttlfbwvpxsjlriaisktuucnd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014452.9810326-609-158475485559820/AnsiballZ_file.py'
Dec 06 09:47:33 compute-1 sudo[106943]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:33 compute-1 python3.9[106945]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/var/lib/edpm-config/firewall/sshd-networks.yaml _original_basename=firewall.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/sshd-networks.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:47:33 compute-1 sudo[106943]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:34 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c1c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:47:34 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:47:34 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 09:47:34 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:34.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 09:47:34 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:47:34 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 09:47:34 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:34.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 09:47:34 compute-1 ceph-mon[79770]: pgmap v182: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Dec 06 09:47:35 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:35 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c4c0096e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:47:35 compute-1 sudo[107096]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tcfpjitpdxccdmqofkfksrxvltcjnpjx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014454.6826277-654-119218255326367/AnsiballZ_timezone.py'
Dec 06 09:47:35 compute-1 sudo[107096]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:35 compute-1 python3.9[107098]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Dec 06 09:47:35 compute-1 systemd[1]: Starting Time & Date Service...
Dec 06 09:47:35 compute-1 systemd[1]: Started Time & Date Service.
Dec 06 09:47:35 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:35 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c34002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:47:35 compute-1 sudo[107096]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:36 compute-1 sudo[107252]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wvjvgkgwnjmwpdqeynajhyntdxitofys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014455.8216896-681-67907710454243/AnsiballZ_file.py'
Dec 06 09:47:36 compute-1 sudo[107252]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:36 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c20002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:47:36 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:47:36 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:47:36 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:36.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:47:36 compute-1 python3.9[107254]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:47:36 compute-1 sudo[107252]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:36 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:47:36 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:47:36 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:36.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:47:36 compute-1 sudo[107405]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-haawyxzmbxzwolhwjktigemkyqqpftdk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014456.5034492-705-26840847592252/AnsiballZ_stat.py'
Dec 06 09:47:36 compute-1 sudo[107405]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:36 compute-1 python3.9[107407]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:47:36 compute-1 sudo[107405]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:36 compute-1 ceph-mon[79770]: pgmap v183: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 06 09:47:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:37 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c20002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:47:37 compute-1 sudo[107483]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfhcjwertajovmxdaejavubbndmpuujd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014456.5034492-705-26840847592252/AnsiballZ_file.py'
Dec 06 09:47:37 compute-1 sudo[107483]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:37 compute-1 python3.9[107485]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:47:37 compute-1 sudo[107483]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:37 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c4c0096e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:47:37 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:47:38 compute-1 sudo[107635]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ikzgsspuqmczdlelpdxqsatloqmcnkhn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014457.745499-741-83099395874427/AnsiballZ_stat.py'
Dec 06 09:47:38 compute-1 sudo[107635]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:38 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:38 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c34002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:47:38 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:47:38 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:47:38 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:38.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:47:38 compute-1 python3.9[107637]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:47:38 compute-1 sudo[107635]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:38 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:47:38 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 09:47:38 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:38.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 09:47:38 compute-1 sudo[107714]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vhfpjoqnveubcyaxmbortbqwrpochrgn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014457.745499-741-83099395874427/AnsiballZ_file.py'
Dec 06 09:47:38 compute-1 sudo[107714]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:38 compute-1 python3.9[107716]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.zdxwm14a recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:47:38 compute-1 sudo[107714]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:39 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:39 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c20002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:47:39 compute-1 ceph-mon[79770]: pgmap v184: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 06 09:47:39 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 09:47:39 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:39 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c20002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:47:39 compute-1 sudo[107866]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxkdsyyywwuciwjbidcsssmzfpxaspdb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014459.2599351-777-134186337128138/AnsiballZ_stat.py'
Dec 06 09:47:39 compute-1 sudo[107866]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:39 compute-1 python3.9[107868]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:47:39 compute-1 sudo[107866]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:39 compute-1 sudo[107944]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fwxnqxcriajmuyyvxzcnqmpgbrrxkveo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014459.2599351-777-134186337128138/AnsiballZ_file.py'
Dec 06 09:47:39 compute-1 sudo[107944]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:40 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c4c00a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:47:40 compute-1 python3.9[107946]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:47:40 compute-1 sudo[107944]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:40 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:47:40 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:47:40 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:40.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:47:40 compute-1 sudo[107972]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:47:40 compute-1 sudo[107972]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:47:40 compute-1 sudo[107972]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:40 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:47:40 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 09:47:40 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:40.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 09:47:40 compute-1 sudo[107997]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 06 09:47:40 compute-1 sudo[107997]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:47:40 compute-1 ceph-mon[79770]: pgmap v185: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 06 09:47:40 compute-1 sudo[108167]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lagkfktbkijqnlnzjqhefromzkjfmccx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014460.488854-816-209503054643404/AnsiballZ_command.py'
Dec 06 09:47:40 compute-1 sudo[108167]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:40 compute-1 sudo[107997]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:41 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:41 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c34002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:47:41 compute-1 python3.9[108170]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:47:41 compute-1 sudo[108167]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:41 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:41 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c20002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:47:41 compute-1 sudo[108332]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ufihrzyokgefofrrbmaucpoomlcrralj ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765014461.3502119-840-191161795520145/AnsiballZ_edpm_nftables_from_files.py'
Dec 06 09:47:41 compute-1 sudo[108332]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:41 compute-1 python3[108334]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec 06 09:47:42 compute-1 sudo[108332]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:42 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c1c003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:47:42 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 09:47:42 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 06 09:47:42 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:47:42 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:47:42 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 06 09:47:42 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 06 09:47:42 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 09:47:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:42 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 09:47:42 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:47:42 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:47:42 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:42.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:47:42 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:47:42 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:47:42 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:42.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:47:42 compute-1 sudo[108485]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twpmffirfnawwaorfvgefnmbowclepgn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014462.294001-865-222045380770886/AnsiballZ_stat.py'
Dec 06 09:47:42 compute-1 sudo[108485]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:42 compute-1 python3.9[108487]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:47:42 compute-1 sudo[108485]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:42 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:47:43 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:43 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c4c00a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:47:43 compute-1 sudo[108563]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-euuptzbbrzbyuvalswtzwwsdmsaiurhd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014462.294001-865-222045380770886/AnsiballZ_file.py'
Dec 06 09:47:43 compute-1 sudo[108563]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:43 compute-1 python3.9[108565]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:47:43 compute-1 sudo[108563]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:43 compute-1 ceph-mon[79770]: pgmap v186: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 06 09:47:43 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:43 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c34003cd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:47:43 compute-1 sudo[108715]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fdywcuptcgrgixlnrxsqkoxnozgqzxgd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014463.5065153-900-164187693192183/AnsiballZ_stat.py'
Dec 06 09:47:43 compute-1 sudo[108715]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:44 compute-1 python3.9[108717]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:47:44 compute-1 sudo[108715]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:44 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:44 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c20003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:47:44 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:47:44 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:47:44 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:44.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:47:44 compute-1 sudo[108794]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hshyuxgfdeuhpbjzkymonpjbrvbhcini ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014463.5065153-900-164187693192183/AnsiballZ_file.py'
Dec 06 09:47:44 compute-1 sudo[108794]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:44 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:47:44 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:47:44 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:44.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:47:44 compute-1 python3.9[108796]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:47:44 compute-1 sudo[108794]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:45 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:45 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c20003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:47:45 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:45 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 09:47:45 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:45 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 09:47:45 compute-1 sudo[108946]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pdmoyrndnreeoxnmdjwtypovbtluzvzi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014464.8834858-936-254166248240759/AnsiballZ_stat.py'
Dec 06 09:47:45 compute-1 sudo[108946]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:45 compute-1 python3.9[108948]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:47:45 compute-1 sudo[108946]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:45 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:45 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c4c00a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:47:46 compute-1 sudo[109024]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wlqouhvljqmbhhegarmmoklwgpsavkgh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014464.8834858-936-254166248240759/AnsiballZ_file.py'
Dec 06 09:47:46 compute-1 sudo[109024]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:46 compute-1 ceph-mon[79770]: pgmap v187: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.4 KiB/s rd, 597 B/s wr, 2 op/s
Dec 06 09:47:46 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:46 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c34003cd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:47:46 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:46 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 09:47:46 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:47:46 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:47:46 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:46.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:47:46 compute-1 sudo[109028]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 09:47:46 compute-1 sudo[109028]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:47:46 compute-1 sudo[109028]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:46 compute-1 python3.9[109026]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:47:46 compute-1 sudo[109024]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:46 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:47:46 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 09:47:46 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:46.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 09:47:47 compute-1 sudo[109202]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sbzfebuddwfzidcbaofqchqnhvvsojim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014466.6908836-972-15370727119203/AnsiballZ_stat.py'
Dec 06 09:47:47 compute-1 sudo[109202]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:47 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c20003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:47:47 compute-1 ceph-mon[79770]: pgmap v188: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Dec 06 09:47:47 compute-1 python3.9[109204]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:47:47 compute-1 sudo[109202]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:47 compute-1 sudo[109280]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ialfrrmvmoqhagaqbkmtcnhfioxorynt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014466.6908836-972-15370727119203/AnsiballZ_file.py'
Dec 06 09:47:47 compute-1 sudo[109280]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:47 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c20003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:47:47 compute-1 python3.9[109282]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:47:47 compute-1 sudo[109280]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:47 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:47:48 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:48 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c4c00a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:47:48 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:47:48 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:47:48 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:48.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:47:48 compute-1 sudo[109433]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ngybekdlxqbkittatjccgjkdqkxtueme ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014467.9581642-1008-73602651004937/AnsiballZ_stat.py'
Dec 06 09:47:48 compute-1 sudo[109433]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:48 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:47:48 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:47:48 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:48.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:47:48 compute-1 python3.9[109435]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:47:48 compute-1 sudo[109433]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:48 compute-1 ceph-mon[79770]: pgmap v189: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.7 KiB/s rd, 938 B/s wr, 3 op/s
Dec 06 09:47:48 compute-1 sudo[109511]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qdcxewgzbtlnebxmqtklqgawgnfrmzrx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014467.9581642-1008-73602651004937/AnsiballZ_file.py'
Dec 06 09:47:48 compute-1 sudo[109511]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:49 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:49 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c34003cd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:47:49 compute-1 python3.9[109513]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-rules.nft _original_basename=ruleset.j2 recurse=False state=file path=/etc/nftables/edpm-rules.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:47:49 compute-1 sudo[109511]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:49 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:49 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec 06 09:47:49 compute-1 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #22. Immutable memtables: 0.
Dec 06 09:47:49 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:47:49.378767) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 06 09:47:49 compute-1 ceph-mon[79770]: rocksdb: [db/flush_job.cc:856] [default] [JOB 9] Flushing memtable with next log file: 22
Dec 06 09:47:49 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014469379053, "job": 9, "event": "flush_started", "num_memtables": 1, "num_entries": 1506, "num_deletes": 250, "total_data_size": 3823377, "memory_usage": 3879432, "flush_reason": "Manual Compaction"}
Dec 06 09:47:49 compute-1 ceph-mon[79770]: rocksdb: [db/flush_job.cc:885] [default] [JOB 9] Level-0 flush table #23: started
Dec 06 09:47:49 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014469404610, "cf_name": "default", "job": 9, "event": "table_file_creation", "file_number": 23, "file_size": 1471945, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 10666, "largest_seqno": 12167, "table_properties": {"data_size": 1467153, "index_size": 2188, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1541, "raw_key_size": 12146, "raw_average_key_size": 20, "raw_value_size": 1456834, "raw_average_value_size": 2407, "num_data_blocks": 97, "num_entries": 605, "num_filter_entries": 605, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014324, "oldest_key_time": 1765014324, "file_creation_time": 1765014469, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 23, "seqno_to_time_mapping": "N/A"}}
Dec 06 09:47:49 compute-1 ceph-mon[79770]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 9] Flush lasted 25916 microseconds, and 9121 cpu microseconds.
Dec 06 09:47:49 compute-1 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 09:47:49 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:47:49.404756) [db/flush_job.cc:967] [default] [JOB 9] Level-0 flush table #23: 1471945 bytes OK
Dec 06 09:47:49 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:47:49.404811) [db/memtable_list.cc:519] [default] Level-0 commit table #23 started
Dec 06 09:47:49 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:47:49.417118) [db/memtable_list.cc:722] [default] Level-0 commit table #23: memtable #1 done
Dec 06 09:47:49 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:47:49.417224) EVENT_LOG_v1 {"time_micros": 1765014469417210, "job": 9, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 06 09:47:49 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:47:49.417264) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 06 09:47:49 compute-1 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 9] Try to delete WAL files size 3816377, prev total WAL file size 3816377, number of live WAL files 2.
Dec 06 09:47:49 compute-1 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000019.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 09:47:49 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:47:49.419252) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740030' seq:72057594037927935, type:22 .. '6D67727374617400323531' seq:0, type:0; will stop at (end)
Dec 06 09:47:49 compute-1 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 10] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 06 09:47:49 compute-1 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 9 Base level 0, inputs: [23(1437KB)], [21(14MB)]
Dec 06 09:47:49 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014469419509, "job": 10, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [23], "files_L6": [21], "score": -1, "input_data_size": 16265369, "oldest_snapshot_seqno": -1}
Dec 06 09:47:49 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:49 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c1c004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:47:49 compute-1 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 10] Generated table #24: 4281 keys, 14215931 bytes, temperature: kUnknown
Dec 06 09:47:49 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014469618650, "cf_name": "default", "job": 10, "event": "table_file_creation", "file_number": 24, "file_size": 14215931, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14183043, "index_size": 21066, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10757, "raw_key_size": 108825, "raw_average_key_size": 25, "raw_value_size": 14100564, "raw_average_value_size": 3293, "num_data_blocks": 902, "num_entries": 4281, "num_filter_entries": 4281, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014012, "oldest_key_time": 0, "file_creation_time": 1765014469, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 24, "seqno_to_time_mapping": "N/A"}}
Dec 06 09:47:49 compute-1 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 09:47:49 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:47:49.619084) [db/compaction/compaction_job.cc:1663] [default] [JOB 10] Compacted 1@0 + 1@6 files to L6 => 14215931 bytes
Dec 06 09:47:49 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:47:49.621346) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 81.6 rd, 71.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.4, 14.1 +0.0 blob) out(13.6 +0.0 blob), read-write-amplify(20.7) write-amplify(9.7) OK, records in: 4730, records dropped: 449 output_compression: NoCompression
Dec 06 09:47:49 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:47:49.621378) EVENT_LOG_v1 {"time_micros": 1765014469621361, "job": 10, "event": "compaction_finished", "compaction_time_micros": 199280, "compaction_time_cpu_micros": 59394, "output_level": 6, "num_output_files": 1, "total_output_size": 14215931, "num_input_records": 4730, "num_output_records": 4281, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 06 09:47:49 compute-1 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000023.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 09:47:49 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014469621810, "job": 10, "event": "table_file_deletion", "file_number": 23}
Dec 06 09:47:49 compute-1 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000021.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 09:47:49 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014469624307, "job": 10, "event": "table_file_deletion", "file_number": 21}
Dec 06 09:47:49 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:47:49.418880) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 09:47:49 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:47:49.624355) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 09:47:49 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:47:49.624361) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 09:47:49 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:47:49.624362) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 09:47:49 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:47:49.624364) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 09:47:49 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:47:49.624365) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 09:47:49 compute-1 sudo[109663]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mqzpsloegpchuqfoynwagxgwqovkzjzz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014469.45925-1047-208023497483343/AnsiballZ_command.py'
Dec 06 09:47:49 compute-1 sudo[109663]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:49 compute-1 python3.9[109665]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:47:50 compute-1 sudo[109663]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:50 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:50 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c20003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:47:50 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:47:50 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:47:50 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:50.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:47:50 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:47:50 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:47:50 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:50.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:47:50 compute-1 sudo[109821]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kpmttpxblqbzswacgyhtjowvvgmaazqn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014470.2590823-1071-59833548921078/AnsiballZ_blockinfile.py'
Dec 06 09:47:50 compute-1 sudo[109821]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:50 compute-1 python3.9[109823]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:47:50 compute-1 sudo[109821]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:51 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:51 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c4c00a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:47:51 compute-1 ceph-mon[79770]: pgmap v190: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.7 KiB/s rd, 938 B/s wr, 3 op/s
Dec 06 09:47:51 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:51 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c38001230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:47:51 compute-1 sudo[109973]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgkquaufycswtaklifqmdklrrkenzpml ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014471.2600243-1098-219368244741738/AnsiballZ_file.py'
Dec 06 09:47:51 compute-1 sudo[109973]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:51 compute-1 sudo[109976]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:47:51 compute-1 sudo[109976]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:47:51 compute-1 sudo[109976]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:51 compute-1 python3.9[109975]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:47:51 compute-1 sudo[109973]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:52 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c1c004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:47:52 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:47:52 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:47:52 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:52.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:47:52 compute-1 sudo[110151]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tlnidbcclluuxruuxofzitbbvtvpdlzh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014471.9617677-1098-267757523632410/AnsiballZ_file.py'
Dec 06 09:47:52 compute-1 sudo[110151]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:52 compute-1 python3.9[110153]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:47:52 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:47:52 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:47:52 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:52.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:47:52 compute-1 sudo[110151]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:52 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:47:52 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:47:52 compute-1 ceph-mon[79770]: pgmap v191: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.7 KiB/s rd, 938 B/s wr, 3 op/s
Dec 06 09:47:52 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:47:53 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:53 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c20003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:47:53 compute-1 sudo[110303]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tuwcncdhvtedmvtaeqfpecytkrgobtnm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014472.6928482-1143-83906321582165/AnsiballZ_mount.py'
Dec 06 09:47:53 compute-1 sudo[110303]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:53 compute-1 python3.9[110305]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Dec 06 09:47:53 compute-1 sudo[110303]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:53 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:53 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c4c00a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:47:53 compute-1 sudo[110455]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kfpgpsjnyehyirwfsecmrixrsllkmzpu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014473.6141026-1143-83698061045273/AnsiballZ_mount.py'
Dec 06 09:47:53 compute-1 sudo[110455]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:47:54 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 09:47:54 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:54 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c38001d30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:47:54 compute-1 python3.9[110457]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Dec 06 09:47:54 compute-1 sudo[110455]: pam_unix(sudo:session): session closed for user root
Dec 06 09:47:54 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:47:54 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 09:47:54 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:54.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 09:47:54 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:47:54 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:47:54 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:54.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:47:54 compute-1 sshd-session[103144]: Connection closed by 192.168.122.30 port 60946
Dec 06 09:47:54 compute-1 sshd-session[103141]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:47:54 compute-1 systemd[1]: session-43.scope: Deactivated successfully.
Dec 06 09:47:54 compute-1 systemd[1]: session-43.scope: Consumed 31.935s CPU time.
Dec 06 09:47:54 compute-1 systemd-logind[788]: Session 43 logged out. Waiting for processes to exit.
Dec 06 09:47:54 compute-1 systemd-logind[788]: Removed session 43.
Dec 06 09:47:55 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/094755 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec 06 09:47:55 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:55 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c1c004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:47:55 compute-1 ceph-mon[79770]: pgmap v192: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.9 KiB/s rd, 1023 B/s wr, 4 op/s
Dec 06 09:47:55 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:55 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c20003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:47:56 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:56 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c4c00a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:47:56 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:47:56 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:47:56 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:56.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:47:56 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:47:56 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:47:56 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:56.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:47:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:57 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c4c00a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:47:57 compute-1 ceph-mon[79770]: pgmap v193: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.7 KiB/s rd, 426 B/s wr, 2 op/s
Dec 06 09:47:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:57 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c38001d30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:47:57 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:47:58 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:58 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c20003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:47:58 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:47:58 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:47:58 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:47:58.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:47:58 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:47:58 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 09:47:58 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:47:58.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 09:47:59 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:59 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c1c004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:47:59 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:47:59 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c4c00a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:00 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:00 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c38002a40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:00 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:48:00 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 09:48:00 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:00.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 09:48:00 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:48:00 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 09:48:00 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:00.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 09:48:00 compute-1 sshd-session[110486]: Accepted publickey for zuul from 192.168.122.30 port 49608 ssh2: ECDSA SHA256:r1j7aLsKAM+XxDNbzEU5vWGpGNCOaIBwc7FZdATPttA
Dec 06 09:48:00 compute-1 systemd-logind[788]: New session 44 of user zuul.
Dec 06 09:48:00 compute-1 systemd[1]: Started Session 44 of User zuul.
Dec 06 09:48:00 compute-1 sshd-session[110486]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 06 09:48:00 compute-1 ceph-mon[79770]: pgmap v194: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.7 KiB/s rd, 426 B/s wr, 2 op/s
Dec 06 09:48:01 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:01 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c20003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:01 compute-1 sudo[110639]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lgeqxiruhkljyvfsukywskvdperetbkd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014480.7628474-19-221353108153450/AnsiballZ_tempfile.py'
Dec 06 09:48:01 compute-1 sudo[110639]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:48:01 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:01 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c1c004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:01 compute-1 python3.9[110641]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Dec 06 09:48:01 compute-1 sudo[110639]: pam_unix(sudo:session): session closed for user root
Dec 06 09:48:01 compute-1 ceph-mon[79770]: pgmap v195: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Dec 06 09:48:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:02 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c4c00a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:02 compute-1 sudo[110792]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wpsommkrtkwnbewphvozwmgvudtdvtoe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014481.7990386-55-237400631378646/AnsiballZ_stat.py'
Dec 06 09:48:02 compute-1 sudo[110792]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:48:02 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:48:02 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 09:48:02 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:02.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 09:48:02 compute-1 python3.9[110794]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:48:02 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:48:02 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 09:48:02 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:02.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 09:48:02 compute-1 sudo[110792]: pam_unix(sudo:session): session closed for user root
Dec 06 09:48:02 compute-1 ceph-mon[79770]: pgmap v196: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Dec 06 09:48:02 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:48:03 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:03 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c38002a40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:03 compute-1 sudo[110946]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rawhjjcxpjlrcfqxkgsgwczlxjpguzve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014482.7203114-79-280535186176801/AnsiballZ_slurp.py'
Dec 06 09:48:03 compute-1 sudo[110946]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:48:03 compute-1 python3.9[110948]: ansible-ansible.builtin.slurp Invoked with src=/etc/ssh/ssh_known_hosts
Dec 06 09:48:03 compute-1 sudo[110946]: pam_unix(sudo:session): session closed for user root
Dec 06 09:48:03 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:03 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c20003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:03 compute-1 sudo[111098]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bowdyenmtrrcvtspoxzelplvfkupgsyv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014483.5146506-103-145943217284033/AnsiballZ_stat.py'
Dec 06 09:48:03 compute-1 sudo[111098]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:48:03 compute-1 python3.9[111100]: ansible-ansible.legacy.stat Invoked with path=/tmp/ansible.ua2f2h2u follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:48:04 compute-1 sudo[111098]: pam_unix(sudo:session): session closed for user root
Dec 06 09:48:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:04 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c1c004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:04 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:48:04 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:48:04 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:04.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:48:04 compute-1 sudo[111224]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-desobdspspmdhwothrwgpbqvzmcmesjh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014483.5146506-103-145943217284033/AnsiballZ_copy.py'
Dec 06 09:48:04 compute-1 sudo[111224]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:48:04 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:48:04 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:48:04 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:04.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:48:04 compute-1 python3.9[111226]: ansible-ansible.legacy.copy Invoked with dest=/tmp/ansible.ua2f2h2u mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765014483.5146506-103-145943217284033/.source.ua2f2h2u _original_basename=.uocet0ja follow=False checksum=741dc69011fb61b699872c865e152b9968457717 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:48:04 compute-1 sudo[111224]: pam_unix(sudo:session): session closed for user root
Dec 06 09:48:04 compute-1 ceph-mon[79770]: pgmap v197: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 85 B/s wr, 0 op/s
Dec 06 09:48:05 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:05 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c4c00a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:05 compute-1 sudo[111376]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fmxofvfgbrtwokzfouinsyswtilmpxeb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014484.7905219-148-40359438537705/AnsiballZ_setup.py'
Dec 06 09:48:05 compute-1 sudo[111376]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:48:05 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:05 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c38002a40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:05 compute-1 systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec 06 09:48:05 compute-1 python3.9[111378]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:48:05 compute-1 sudo[111376]: pam_unix(sudo:session): session closed for user root
Dec 06 09:48:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:06 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c38002a40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:06 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:48:06 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 09:48:06 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:06.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 09:48:06 compute-1 sudo[111481]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 09:48:06 compute-1 sudo[111481]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:48:06 compute-1 sudo[111481]: pam_unix(sudo:session): session closed for user root
Dec 06 09:48:06 compute-1 sudo[111556]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-woybhqgfkkozlsuxeuzzhyjtpmfjwpbw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014486.0672414-173-142018173983991/AnsiballZ_blockinfile.py'
Dec 06 09:48:06 compute-1 sudo[111556]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:48:06 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:48:06 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 09:48:06 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:06.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 09:48:06 compute-1 python3.9[111558]: ansible-ansible.builtin.blockinfile Invoked with block=compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDneZurSARwLaZA1xEymzXlvVAPvP8u0PCrqXuMYD5ewImDDChRITnk4XHKT/DUfrSJf9/7oJsddEbLRjhCtedqrMZsCkWz1BxtCmPBuvz2LfFhEn27TjqYLctOVGigQGsj6ILvPOzzLiapd93yApWDmH6P0un/ltmdM0iZLygNpzG3HLF8STBXzlo/8slci69Em7XppcrOpl1TS7DaVlpNcRQvo9pFuIrbMD9g0DOdMwk5YCH6g7OzGWqq0gt0YUOztmsqxWHKav3E0SXAD/vkgRc/1ZCNGFNSvf0dIgimCF3xlNWrppnvNgQ1BRqiQ7RArlOp1bVg0Ugdce6f4TIrq36Ois2U5+/myF5WQ7l9hRMRvoP64hSSsRAIDobTI/zMStUP3iZPFngxDxwQtpydHfFGywBL9811c42U7JsGxE8890uOIDk/oOkyhSH6KHQCPFjmKBJ98nT01lgnXyFSNOqds6QOYBasUWNFWd2wS7YpTheGlVVM8bk/gB4K2L0=
                                             compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOMkn8zp09tRuEaH/bUoP0rYj+dziM1KcqMKxOgM9K1U
                                             compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBCrMdvJJYP0cflC7RDFsxwr66nSp9R7QU726CAfJcKLw6vHh8Z9Lw5wLH0kiaSpsb6SAPffloplHEDiwTOkghOc=
                                             compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDAiB67qk/R3IfGpcAH1Ojopc8KX94De+Kxs31cKQLD04X+4QRXPRdMxU85LOhN58eKoHaBi8cgqk7+dvRypGD5vbtbRN9r0VN7tGwiSQTlVFbEuhn0AEbnRwNAMWEEMHO9kEjufP4N2zEEhtQBXy9oO2tMX3+BX4Z3YZZMQyZUgohdBHp2VCul9VdRuo0oHSr8HHm0nN61dMjalnThmgkGAu5hG8qhkWT4i9hroSKBsR5kVBUFTqdXekYkVy4YIYfM2lBXiMOFHtvr1a+KOyIfgWMb7GBPW7oKqtzCfVgSbGaUhSvGzs1OWt3U/PjjapIlmDnwD5ukzVxWV5ldh0vA48tXh5R1wqAoN5/Y/RiAKaY2kd/fvtkhvVDGZluXOz5jJ02IFHm+v4dP3Ig8YOuS5BEkWFuJHkblW0t/+4siTHWwmGEuvUI6y8Gb2pGcBKsWCJtLePYzT09IAmrjwO0jAgbWy0nvCZ+SKlbBBrXP6OgNgMkA+GH9iGOl6FOuRok=
                                             compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGYNj3LmNvR0emoQHuuy9NKXPivs/dznunVy8GExnJl8
                                             compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJhKmGSvg8FMw16qKPzk6Pyj+OHkN3bmk20mts1PdCRcNRnn9sT1DgI6U8Aze1tjGPujT4eDL+Y9r/hsrfM4qDc=
                                             compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCtvqYC0W0zPSX/plyJvm0q1VGDScYTNlcCdllukOe81JRfU3GhVusPZOX0xRSaLP/lmXtfqWcbBRCkLsmFrAo2EHn1CMqMr5WkhY4+rgApF+MGLDOUo57tlKZLPIwdL0SSY/Qv8lBfrqr7LUDZ7fTTTbqTzim/bncxg/u0KxSWBdvjfmYi13SwO65wDkFqSVYa3h8DNij6cRRjQ0fJuJ9Da860hmMnqo9GJMU6dq3zMXXn3YfuF4E4M0UQdlWmVW4EwBTzsfA1XYbSpW7VdRJw6esB4vZ9/Succj+XZiANoDqL9gXSEjNXVVWVbL/7aGJJF9LLQ3VVxmHdbYs1NcTI6Yy9d61zDJHnK/nlYHMhmAHxiDsZEpv0xF72LLzaI86xxvnbx4eUpnyW6LnKiUCYUAUrWIMpLiIbWUxeIoYmj9rqLhwlo5kCy7WdCYYEMTtGI53oIyU0EbXf/r4WAuzmqpVRPyc2Sd5tYD4aXh1JZLUcZy+NLR0Y4SA8RflKFcs=
                                             compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFDJYF6pUvFgGUbY2QEOHAq7ZEhRQJUqPTVPOuTyb476
                                             compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPJ19afQPeSMtr3O9L1fe5+bNzTAsOOCA5fLihUdryDYc29KKD+0XABHKIvqeefcCsIBjZRA//9OzCUftfvXK9A=
                                              create=True mode=0644 path=/tmp/ansible.ua2f2h2u state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:48:06 compute-1 sudo[111556]: pam_unix(sudo:session): session closed for user root
Dec 06 09:48:06 compute-1 ceph-mon[79770]: pgmap v198: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:48:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:07 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c1c004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:07 compute-1 sudo[111708]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hcziuuwcsfnevovmruzltcsqqaybhubh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014486.8917472-197-11910855224401/AnsiballZ_command.py'
Dec 06 09:48:07 compute-1 sudo[111708]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:48:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:07 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c4c00a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:07 compute-1 python3.9[111710]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.ua2f2h2u' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:48:07 compute-1 sudo[111708]: pam_unix(sudo:session): session closed for user root
Dec 06 09:48:07 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:48:08 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:08 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c38002a40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:08 compute-1 sudo[111862]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hblsipbsyraxilmvauhvxahfvmfayblb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014487.7209435-221-178853528083856/AnsiballZ_file.py'
Dec 06 09:48:08 compute-1 sudo[111862]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:48:08 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:48:08 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:48:08 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:08.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:48:08 compute-1 python3.9[111865]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.ua2f2h2u state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:48:08 compute-1 sudo[111862]: pam_unix(sudo:session): session closed for user root
Dec 06 09:48:08 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:48:08 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 09:48:08 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:08.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 09:48:08 compute-1 sshd-session[110489]: Connection closed by 192.168.122.30 port 49608
Dec 06 09:48:08 compute-1 sshd-session[110486]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:48:08 compute-1 systemd-logind[788]: Session 44 logged out. Waiting for processes to exit.
Dec 06 09:48:08 compute-1 systemd[1]: session-44.scope: Deactivated successfully.
Dec 06 09:48:08 compute-1 systemd[1]: session-44.scope: Consumed 5.280s CPU time.
Dec 06 09:48:08 compute-1 systemd-logind[788]: Removed session 44.
Dec 06 09:48:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:09 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c38002a40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:09 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c1c004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:10 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:10 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c4c00a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:10 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:48:10 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:48:10 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:10.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:48:10 compute-1 ceph-mon[79770]: pgmap v199: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:48:10 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:48:10 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:48:10 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:10.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:48:11 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:11 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c38002a40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:11 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:11 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c20003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:12 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c1c004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:12 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:48:12 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:48:12 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:12.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:48:12 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:48:12 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 09:48:12 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:12.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 09:48:12 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 09:48:12 compute-1 ceph-mon[79770]: pgmap v200: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:48:12 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:48:13 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:13 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c4c00a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:13 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:13 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c38002a40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:13 compute-1 ceph-mon[79770]: pgmap v201: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:48:14 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:14 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c20003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:14 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:48:14 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:48:14 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:14.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:48:14 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:48:14 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:48:14 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:14.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:48:15 compute-1 ceph-mon[79770]: pgmap v202: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 09:48:15 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:15 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c1c004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:15 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:15 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c4c00a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:16 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:16 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c38003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:16 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:48:16 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:48:16 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:16.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:48:16 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:48:16 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:48:16 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:16.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:48:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:17 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c20003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:17 compute-1 ceph-mon[79770]: pgmap v203: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:48:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:17 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c1c004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:17 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:48:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:18 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c4c00a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:18 compute-1 sshd-session[111894]: Accepted publickey for zuul from 192.168.122.30 port 42744 ssh2: ECDSA SHA256:r1j7aLsKAM+XxDNbzEU5vWGpGNCOaIBwc7FZdATPttA
Dec 06 09:48:18 compute-1 systemd-logind[788]: New session 45 of user zuul.
Dec 06 09:48:18 compute-1 systemd[1]: Started Session 45 of User zuul.
Dec 06 09:48:18 compute-1 sshd-session[111894]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 06 09:48:18 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:48:18 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:48:18 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:18.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:48:18 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:48:18 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:48:18 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:18.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:48:19 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:19 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c38003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:19 compute-1 ceph-mon[79770]: pgmap v204: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:48:19 compute-1 python3.9[112048]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:48:19 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:19 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c20003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:20 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:20 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c1c004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:20 compute-1 sudo[112203]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wltpsmbzzzvemfuuayfjquhmdrjckkbe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014499.7028472-57-227894569987135/AnsiballZ_systemd.py'
Dec 06 09:48:20 compute-1 sudo[112203]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:48:20 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:48:20 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:48:20 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:20.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:48:20 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:48:20 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:48:20 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:20.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:48:20 compute-1 python3.9[112205]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Dec 06 09:48:20 compute-1 sudo[112203]: pam_unix(sudo:session): session closed for user root
Dec 06 09:48:21 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:21 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c4c00a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:21 compute-1 ceph-mon[79770]: pgmap v205: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:48:21 compute-1 sudo[112358]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iynnhbdxoezzevthqsmkuxrbydtwzxvs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014500.8475122-81-227161916357647/AnsiballZ_systemd.py'
Dec 06 09:48:21 compute-1 sudo[112358]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:48:21 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:21 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c38003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:21 compute-1 python3.9[112360]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 06 09:48:21 compute-1 sudo[112358]: pam_unix(sudo:session): session closed for user root
Dec 06 09:48:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:22 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c340008d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:22 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:48:22 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 09:48:22 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:22.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 09:48:22 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:48:22 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 09:48:22 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:22.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 09:48:22 compute-1 sudo[112513]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yvadvpsamotnqmutkgjyrnublprbztww ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014501.8562055-108-262262593410099/AnsiballZ_command.py'
Dec 06 09:48:22 compute-1 sudo[112513]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:48:22 compute-1 python3.9[112515]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:48:22 compute-1 sudo[112513]: pam_unix(sudo:session): session closed for user root
Dec 06 09:48:22 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:48:23 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:23 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c440013a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:23 compute-1 sudo[112667]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kcjtmbbscesovwxicpwpstylusukmxmj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014502.9803174-132-59183693793209/AnsiballZ_stat.py'
Dec 06 09:48:23 compute-1 sudo[112667]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:48:23 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:23 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c440013a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:23 compute-1 python3.9[112669]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:48:23 compute-1 sudo[112667]: pam_unix(sudo:session): session closed for user root
Dec 06 09:48:23 compute-1 ceph-mon[79770]: pgmap v206: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:48:24 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:24 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c1c004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:24 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:48:24 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 09:48:24 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:24.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 09:48:24 compute-1 sudo[112820]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-icoiaybkdftvgpoowdpvimbunmzvdgom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014503.9761202-159-108416820819057/AnsiballZ_file.py'
Dec 06 09:48:24 compute-1 sudo[112820]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:48:24 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:48:24 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:48:24 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:24.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:48:24 compute-1 python3.9[112822]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:48:24 compute-1 sudo[112820]: pam_unix(sudo:session): session closed for user root
Dec 06 09:48:24 compute-1 ceph-mon[79770]: pgmap v207: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 09:48:24 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 09:48:24 compute-1 sshd-session[112823]: Invalid user admin from 139.19.117.129 port 43072
Dec 06 09:48:24 compute-1 sshd-session[111898]: Connection closed by 192.168.122.30 port 42744
Dec 06 09:48:24 compute-1 sshd-session[111894]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:48:24 compute-1 systemd[1]: session-45.scope: Deactivated successfully.
Dec 06 09:48:24 compute-1 systemd[1]: session-45.scope: Consumed 3.921s CPU time.
Dec 06 09:48:24 compute-1 systemd-logind[788]: Session 45 logged out. Waiting for processes to exit.
Dec 06 09:48:24 compute-1 systemd-logind[788]: Removed session 45.
Dec 06 09:48:25 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:25 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c14000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:25 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:25 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c38003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:26 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:26 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c44002350 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:26 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:48:26 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:48:26 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:26.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:48:26 compute-1 sudo[112850]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 09:48:26 compute-1 sudo[112850]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:48:26 compute-1 sudo[112850]: pam_unix(sudo:session): session closed for user root
Dec 06 09:48:26 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:48:26 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:48:26 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:26.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:48:26 compute-1 ceph-mon[79770]: pgmap v208: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:48:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:27 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c1c004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:27 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c140016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:27 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:48:28 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:28 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c38003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:28 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:48:28 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:48:28 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:28.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:48:28 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:48:28 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:48:28 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:28.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:48:28 compute-1 ceph-mon[79770]: pgmap v209: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:48:29 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:29 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c38003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:29 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:29 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c38003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:30 compute-1 sshd-session[112876]: Accepted publickey for zuul from 192.168.122.30 port 57236 ssh2: ECDSA SHA256:r1j7aLsKAM+XxDNbzEU5vWGpGNCOaIBwc7FZdATPttA
Dec 06 09:48:30 compute-1 systemd-logind[788]: New session 46 of user zuul.
Dec 06 09:48:30 compute-1 systemd[1]: Started Session 46 of User zuul.
Dec 06 09:48:30 compute-1 sshd-session[112876]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 06 09:48:30 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:30 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c140016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:30 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:48:30 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:48:30 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:30.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:48:30 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:48:30 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 09:48:30 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:30.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 09:48:31 compute-1 ceph-mon[79770]: pgmap v210: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:48:31 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:31 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c140016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:31 compute-1 python3.9[113030]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:48:31 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:31 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c140016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:32 compute-1 sudo[113184]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ojktanghcisomjpgzifvkdbdyqerpqzi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014511.7731028-63-4959267432680/AnsiballZ_setup.py'
Dec 06 09:48:32 compute-1 sudo[113184]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:48:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:32 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c38003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:32 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:48:32 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:48:32 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:32.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:48:32 compute-1 python3.9[113186]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 06 09:48:32 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:48:32 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:48:32 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:32.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:48:32 compute-1 sudo[113184]: pam_unix(sudo:session): session closed for user root
Dec 06 09:48:32 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:48:33 compute-1 sudo[113269]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pwqymaovlsgireksxgibrhtbkuomyznn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014511.7731028-63-4959267432680/AnsiballZ_dnf.py'
Dec 06 09:48:33 compute-1 sudo[113269]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:48:33 compute-1 ceph-mon[79770]: pgmap v211: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:48:33 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:33 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c1c004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:33 compute-1 python3.9[113271]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 06 09:48:33 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:33 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c44002350 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:34 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c140016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:34 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:48:34 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 09:48:34 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:34.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 09:48:34 compute-1 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 06 09:48:34 compute-1 sshd-session[112823]: Connection closed by invalid user admin 139.19.117.129 port 43072 [preauth]
Dec 06 09:48:34 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:48:34 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:48:34 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:34.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:48:34 compute-1 sudo[113269]: pam_unix(sudo:session): session closed for user root
Dec 06 09:48:35 compute-1 ceph-mon[79770]: pgmap v212: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 09:48:35 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:35 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c38003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:35 compute-1 python3.9[113424]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:48:35 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:35 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c1c004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:36 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c44003340 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:36 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:48:36 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:48:36 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:36.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:48:36 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:48:36 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:48:36 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:36.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:48:36 compute-1 python3.9[113576]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 06 09:48:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:37 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c140032f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:37 compute-1 ceph-mon[79770]: pgmap v213: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:48:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:37 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c38003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:37 compute-1 python3.9[113726]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:48:37 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:48:38 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:38 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c1c004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:38 compute-1 python3.9[113876]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:48:38 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:48:38 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:48:38 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:38.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:48:38 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:48:38 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:48:38 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:38.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:48:38 compute-1 sshd-session[112879]: Connection closed by 192.168.122.30 port 57236
Dec 06 09:48:38 compute-1 sshd-session[112876]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:48:38 compute-1 systemd[1]: session-46.scope: Deactivated successfully.
Dec 06 09:48:38 compute-1 systemd[1]: session-46.scope: Consumed 6.154s CPU time.
Dec 06 09:48:38 compute-1 systemd-logind[788]: Session 46 logged out. Waiting for processes to exit.
Dec 06 09:48:38 compute-1 systemd-logind[788]: Removed session 46.
Dec 06 09:48:39 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:39 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c44003340 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:39 compute-1 ceph-mon[79770]: pgmap v214: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:48:39 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 09:48:39 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:39 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c140032f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:40 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c38003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:40 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:48:40 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:48:40 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:40.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:48:40 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:48:40 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:48:40 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:40.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:48:41 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:41 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c1c004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:41 compute-1 ceph-mon[79770]: pgmap v215: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:48:41 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:41 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c44003340 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:42 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c14003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:42 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:48:42 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:48:42 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:42.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:48:42 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:48:42 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:48:42 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:42.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:48:42 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:48:43 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:43 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c38003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:43 compute-1 ceph-mon[79770]: pgmap v216: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:48:43 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:43 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c1c004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:43 compute-1 sshd-session[113904]: Accepted publickey for zuul from 192.168.122.30 port 46758 ssh2: ECDSA SHA256:r1j7aLsKAM+XxDNbzEU5vWGpGNCOaIBwc7FZdATPttA
Dec 06 09:48:44 compute-1 systemd-logind[788]: New session 47 of user zuul.
Dec 06 09:48:44 compute-1 systemd[1]: Started Session 47 of User zuul.
Dec 06 09:48:44 compute-1 sshd-session[113904]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 06 09:48:44 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:44 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c440047d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:44 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:48:44 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:48:44 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:44.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:48:44 compute-1 ceph-mon[79770]: pgmap v217: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 09:48:44 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:48:44 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:48:44 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:44.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:48:45 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:45 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c14003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:45 compute-1 python3.9[114058]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:48:45 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:45 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c38003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:46 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:46 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c1c004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:46 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:48:46 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:48:46 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:46.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:48:46 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:48:46 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:48:46 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:46.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:48:46 compute-1 ceph-mon[79770]: pgmap v218: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:48:46 compute-1 sudo[114175]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 09:48:46 compute-1 sudo[114175]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:48:46 compute-1 sudo[114175]: pam_unix(sudo:session): session closed for user root
Dec 06 09:48:46 compute-1 sudo[114238]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dkuswnarldfoccrecmqmbeyqdvdixiyd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014526.260961-113-79363678278402/AnsiballZ_file.py'
Dec 06 09:48:46 compute-1 sudo[114238]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:48:46 compute-1 python3.9[114240]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:48:46 compute-1 sudo[114238]: pam_unix(sudo:session): session closed for user root
Dec 06 09:48:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:47 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c440047d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:47 compute-1 sudo[114390]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ooycrmvoowigtcssczvtgbjryravdpmo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014527.1085386-113-21810636433621/AnsiballZ_file.py'
Dec 06 09:48:47 compute-1 sudo[114390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:48:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:47 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c440047d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:47 compute-1 python3.9[114392]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:48:47 compute-1 sudo[114390]: pam_unix(sudo:session): session closed for user root
Dec 06 09:48:47 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:48:48 compute-1 sudo[114542]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jaqrplaplagadjfvksuqpslyenlxqncv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014527.7947607-158-104226358948331/AnsiballZ_stat.py'
Dec 06 09:48:48 compute-1 sudo[114542]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:48:48 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:48 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c38003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:48 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:48:48 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:48:48 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:48.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:48:48 compute-1 python3.9[114544]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:48:48 compute-1 sudo[114542]: pam_unix(sudo:session): session closed for user root
Dec 06 09:48:48 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:48:48 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:48:48 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:48.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:48:48 compute-1 sudo[114666]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bapbnnxlkummwkcmbllagrkfyolstyre ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014527.7947607-158-104226358948331/AnsiballZ_copy.py'
Dec 06 09:48:48 compute-1 sudo[114666]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:48:49 compute-1 ceph-mon[79770]: pgmap v219: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:48:49 compute-1 python3.9[114668]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014527.7947607-158-104226358948331/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=ce0f53450193d0e253b88fe8ddc0e5fff4cd2fd2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:48:49 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:49 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c38003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:49 compute-1 sudo[114666]: pam_unix(sudo:session): session closed for user root
Dec 06 09:48:49 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:49 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c14003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:49 compute-1 sudo[114818]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hvkvamvzodaxotmyhetsvinbtjcxjdxb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014529.2583728-158-63358720721383/AnsiballZ_stat.py'
Dec 06 09:48:49 compute-1 sudo[114818]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:48:49 compute-1 python3.9[114820]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:48:49 compute-1 sudo[114818]: pam_unix(sudo:session): session closed for user root
Dec 06 09:48:50 compute-1 sudo[114941]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jeliyptppfybuqyczslpmolynokuucwc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014529.2583728-158-63358720721383/AnsiballZ_copy.py'
Dec 06 09:48:50 compute-1 sudo[114941]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:48:50 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:50 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c440047d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:50 compute-1 python3.9[114943]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014529.2583728-158-63358720721383/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=f805cc6455e59702aa77bd6ffe81bb9b155b0be7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:48:50 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:48:50 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:48:50 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:50.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:48:50 compute-1 sudo[114941]: pam_unix(sudo:session): session closed for user root
Dec 06 09:48:50 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:48:50 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:48:50 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:50.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:48:50 compute-1 sudo[115094]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nkthdeoxvopjwaohmiebmmeeujpiwrmm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014530.4886403-158-73806083441357/AnsiballZ_stat.py'
Dec 06 09:48:50 compute-1 sudo[115094]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:48:50 compute-1 python3.9[115096]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:48:50 compute-1 sudo[115094]: pam_unix(sudo:session): session closed for user root
Dec 06 09:48:51 compute-1 ceph-mon[79770]: pgmap v220: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:48:51 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:51 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c440047d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:51 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:51 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c38003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:51 compute-1 sudo[115217]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-crtxaumhalynmcgzqgqhicutmdxblocs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014530.4886403-158-73806083441357/AnsiballZ_copy.py'
Dec 06 09:48:51 compute-1 sudo[115217]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:48:51 compute-1 python3.9[115219]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014530.4886403-158-73806083441357/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=cca74c031dd75057ea2d8bce881d587d45d382dd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:48:51 compute-1 sudo[115217]: pam_unix(sudo:session): session closed for user root
Dec 06 09:48:51 compute-1 sudo[115244]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:48:51 compute-1 sudo[115244]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:48:51 compute-1 sudo[115244]: pam_unix(sudo:session): session closed for user root
Dec 06 09:48:52 compute-1 sudo[115281]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 check-host
Dec 06 09:48:52 compute-1 sudo[115281]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:48:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:52 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c14003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:52 compute-1 sudo[115432]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zolhvclxynfcoyttefxssrghympcdfjf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014531.994263-301-211444920581901/AnsiballZ_file.py'
Dec 06 09:48:52 compute-1 sudo[115432]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:48:52 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:48:52 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:48:52 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:52.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:48:52 compute-1 sudo[115281]: pam_unix(sudo:session): session closed for user root
Dec 06 09:48:52 compute-1 python3.9[115436]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:48:52 compute-1 sudo[115432]: pam_unix(sudo:session): session closed for user root
Dec 06 09:48:52 compute-1 sudo[115443]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:48:52 compute-1 sudo[115443]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:48:52 compute-1 sudo[115443]: pam_unix(sudo:session): session closed for user root
Dec 06 09:48:52 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:48:52 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:48:52 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:52.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:48:52 compute-1 sudo[115473]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 06 09:48:52 compute-1 sudo[115473]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:48:52 compute-1 sudo[115657]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qnchepmtkjwojfhxfuagccynhamclkwk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014532.6216671-301-115883383495950/AnsiballZ_file.py'
Dec 06 09:48:52 compute-1 sudo[115657]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:48:52 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:48:53 compute-1 python3.9[115660]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:48:53 compute-1 sudo[115473]: pam_unix(sudo:session): session closed for user root
Dec 06 09:48:53 compute-1 sudo[115657]: pam_unix(sudo:session): session closed for user root
Dec 06 09:48:53 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:53 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c440047d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:53 compute-1 ceph-mon[79770]: pgmap v221: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:48:53 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:48:53 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:48:53 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:48:53 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:48:53 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 09:48:53 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 06 09:48:53 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:48:53 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:48:53 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 06 09:48:53 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 06 09:48:53 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 09:48:53 compute-1 sudo[115826]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xoonqietchhxdnkzlolwkishgoaiyqgk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014533.2410674-346-274722918074657/AnsiballZ_stat.py'
Dec 06 09:48:53 compute-1 sudo[115826]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:48:53 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:53 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c1c004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:53 compute-1 python3.9[115828]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:48:53 compute-1 sudo[115826]: pam_unix(sudo:session): session closed for user root
Dec 06 09:48:54 compute-1 sudo[115949]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yqbsswmtpyjcrzxdgmtpzutkkdobagpp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014533.2410674-346-274722918074657/AnsiballZ_copy.py'
Dec 06 09:48:54 compute-1 sudo[115949]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:48:54 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:54 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c38003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:54 compute-1 python3.9[115951]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014533.2410674-346-274722918074657/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=b03d35dc7e50a7209707916f12027739ad55ce95 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:48:54 compute-1 sudo[115949]: pam_unix(sudo:session): session closed for user root
Dec 06 09:48:54 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:48:54 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:48:54 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:54.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:48:54 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 09:48:54 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:48:54 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 09:48:54 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:54.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 09:48:54 compute-1 sudo[116104]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-afjqzbcjvvikkyfcfbjmnypxwhcfbkfm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014534.4098713-346-80246700132645/AnsiballZ_stat.py'
Dec 06 09:48:54 compute-1 sudo[116104]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:48:54 compute-1 python3.9[116106]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:48:54 compute-1 sudo[116104]: pam_unix(sudo:session): session closed for user root
Dec 06 09:48:55 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:55 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c14003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:55 compute-1 sudo[116227]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jsmaklynjpfinguezzbfqsuepjgmbuic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014534.4098713-346-80246700132645/AnsiballZ_copy.py'
Dec 06 09:48:55 compute-1 sudo[116227]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:48:55 compute-1 python3.9[116229]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014534.4098713-346-80246700132645/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=72139a22070e52361b83b34c98df3f4b6e2a8fd5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:48:55 compute-1 sudo[116227]: pam_unix(sudo:session): session closed for user root
Dec 06 09:48:55 compute-1 ceph-mon[79770]: pgmap v222: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 09:48:55 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:55 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c34000b00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:55 compute-1 sudo[116379]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qjphclewjjhzmvpdgzxlwwacaculiciv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014535.5992563-346-211906494804494/AnsiballZ_stat.py'
Dec 06 09:48:55 compute-1 sudo[116379]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:48:56 compute-1 python3.9[116381]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:48:56 compute-1 sudo[116379]: pam_unix(sudo:session): session closed for user root
Dec 06 09:48:56 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:56 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c4c001320 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:56 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:48:56 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:48:56 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:56.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:48:56 compute-1 ceph-mon[79770]: pgmap v223: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:48:56 compute-1 sudo[116503]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yoxsqicaglcqxmcwaqpwefehpytajbnu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014535.5992563-346-211906494804494/AnsiballZ_copy.py'
Dec 06 09:48:56 compute-1 sudo[116503]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:48:56 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:48:56 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:48:56 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:56.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:48:56 compute-1 python3.9[116505]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014535.5992563-346-211906494804494/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=37cc98c5d0534b14ac35b3e937f249b933b173c7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:48:56 compute-1 sudo[116503]: pam_unix(sudo:session): session closed for user root
Dec 06 09:48:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:57 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c38003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:57 compute-1 sudo[116655]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hdjxawnkxwrmeynivrwuxovhvytikony ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014536.9472272-478-266591134839847/AnsiballZ_file.py'
Dec 06 09:48:57 compute-1 sudo[116655]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:48:57 compute-1 python3.9[116657]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:48:57 compute-1 sudo[116655]: pam_unix(sudo:session): session closed for user root
Dec 06 09:48:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:57 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c34000b00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:57 compute-1 sudo[116705]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:48:57 compute-1 sudo[116705]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:48:57 compute-1 sudo[116705]: pam_unix(sudo:session): session closed for user root
Dec 06 09:48:57 compute-1 sudo[116832]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-okrinyilhzhjcqsuxsubnlzowmmpnogb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014537.5833657-478-203247550196445/AnsiballZ_file.py'
Dec 06 09:48:57 compute-1 sudo[116832]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:48:57 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:48:58 compute-1 python3.9[116834]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:48:58 compute-1 sudo[116832]: pam_unix(sudo:session): session closed for user root
Dec 06 09:48:58 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:58 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c14003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:58 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:48:58 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:48:58 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:48:58.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:48:58 compute-1 sudo[116985]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fxxifrxivnjpewcvctfhjxtazszifknm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014538.2281365-527-182369146049692/AnsiballZ_stat.py'
Dec 06 09:48:58 compute-1 sudo[116985]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:48:58 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:48:58 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:48:58 compute-1 ceph-mon[79770]: pgmap v224: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:48:58 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:48:58 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:48:58 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:48:58.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:48:58 compute-1 python3.9[116987]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:48:58 compute-1 sudo[116985]: pam_unix(sudo:session): session closed for user root
Dec 06 09:48:58 compute-1 sudo[117108]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pznpudumobgxfbepuauobrirohkgxqme ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014538.2281365-527-182369146049692/AnsiballZ_copy.py'
Dec 06 09:48:58 compute-1 sudo[117108]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:48:59 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:59 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c4c001320 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:59 compute-1 python3.9[117110]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014538.2281365-527-182369146049692/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=68af34295cc65bfb4aba41f49e48fcf0501e5a64 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:48:59 compute-1 sudo[117108]: pam_unix(sudo:session): session closed for user root
Dec 06 09:48:59 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:48:59 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c38003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:48:59 compute-1 sudo[117260]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwjnnhwofyahewkcskfneohmkbsvlipp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014539.3632379-527-264476480564139/AnsiballZ_stat.py'
Dec 06 09:48:59 compute-1 sudo[117260]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:48:59 compute-1 python3.9[117262]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:48:59 compute-1 sudo[117260]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:00 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:49:00 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c34000b00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:49:00 compute-1 sudo[117384]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ugnuzuqluwgqrngxjlbfetqkpfnpkqwd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014539.3632379-527-264476480564139/AnsiballZ_copy.py'
Dec 06 09:49:00 compute-1 sudo[117384]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:49:00 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:49:00 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:49:00 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:00.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:49:00 compute-1 python3.9[117386]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014539.3632379-527-264476480564139/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=72139a22070e52361b83b34c98df3f4b6e2a8fd5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:49:00 compute-1 sudo[117384]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:00 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:49:00 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:49:00 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:00.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:49:00 compute-1 sudo[117536]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rotqsjjjlpxxrciugbyrtdkbzorzauso ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014540.6613958-527-91508043372638/AnsiballZ_stat.py'
Dec 06 09:49:00 compute-1 sudo[117536]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:49:00 compute-1 ceph-mon[79770]: pgmap v225: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:49:01 compute-1 python3.9[117538]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:49:01 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:49:01 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c14003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:49:01 compute-1 sudo[117536]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:01 compute-1 sudo[117659]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ykndainvsppbocaiieaceeqnauumvmeg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014540.6613958-527-91508043372638/AnsiballZ_copy.py'
Dec 06 09:49:01 compute-1 sudo[117659]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:49:01 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:49:01 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c4c001320 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:49:01 compute-1 python3.9[117661]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014540.6613958-527-91508043372638/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=25ab66de5377411fb5d5a9d8f4de1740dfe1b562 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:49:01 compute-1 sudo[117659]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:49:02 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c38003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:49:02 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:49:02 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:49:02 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:02.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:49:02 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:49:02 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:49:02 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:02.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:49:02 compute-1 sudo[117812]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rukembjpuvocxjgtrcahmnbssokvtcmm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014542.4350686-705-33943192438382/AnsiballZ_file.py'
Dec 06 09:49:02 compute-1 sudo[117812]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:49:02 compute-1 python3.9[117814]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:49:02 compute-1 sudo[117812]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:02 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:49:03 compute-1 ceph-mon[79770]: pgmap v226: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:49:03 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:49:03 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c34002470 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:49:03 compute-1 sudo[117964]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-akedgcjbxceuajjsoswbkmreavdcawiv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014543.0745337-729-178206841400791/AnsiballZ_stat.py'
Dec 06 09:49:03 compute-1 sudo[117964]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:49:03 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:49:03 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c14003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:49:03 compute-1 python3.9[117966]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:49:03 compute-1 sudo[117964]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:03 compute-1 sudo[118087]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tceagasnryozsijxbepkuedjvmefwkmw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014543.0745337-729-178206841400791/AnsiballZ_copy.py'
Dec 06 09:49:03 compute-1 sudo[118087]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:49:04 compute-1 python3.9[118089]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014543.0745337-729-178206841400791/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=22c202a539af259b977a1afda61dbc1fe0d1039c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:49:04 compute-1 sudo[118087]: pam_unix(sudo:session): session closed for user root
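The entries above show the pattern this whole section repeats for the ovn, libvirt, neutron-metadata, bootstrap, repo-setup and nova CA directories: a privileged ansible.legacy.stat checksums the destination, then ansible.legacy.copy installs the staged file with an explicit owner, group and mode. A minimal Python sketch of that placement logic (not the Ansible source; the source path is the staged upload from the entry above, and error handling is omitted):

# Approximate what the logged ansible.legacy.copy task does for one bundle.
import hashlib
import os
import shutil

SRC = "/home/zuul/.ansible/tmp/ansible-tmp-1765014543.0745337-729-178206841400791/.source.pem"
DEST = "/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem"

def sha1_of(path: str) -> str:
    h = hashlib.sha1()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

# The stat step compares checksums first; only a missing or differing
# destination triggers the copy that the journal records.
if not os.path.exists(DEST) or sha1_of(DEST) != sha1_of(SRC):
    shutil.copy2(SRC, DEST)
    os.chown(DEST, 0, 0)   # owner=root group=root, as logged
    os.chmod(DEST, 0o644)  # mode=0644, as logged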
Dec 06 09:49:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:49:04 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c4c0092f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:49:04 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:49:04 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:49:04 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:04.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:49:04 compute-1 sudo[118240]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vnxzxrhkauqckqlnioqjcrmadpbxpfot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014544.2980857-776-47197133264555/AnsiballZ_file.py'
Dec 06 09:49:04 compute-1 sudo[118240]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:49:04 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:49:04 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:49:04 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:04.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:49:04 compute-1 python3.9[118242]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:49:04 compute-1 sudo[118240]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:05 compute-1 ceph-mon[79770]: pgmap v227: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 09:49:05 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:49:05 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c38003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:49:05 compute-1 sudo[118392]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kvucikipcohxmzzeiednrpkuvkuiyhon ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014544.91911-801-216701076085284/AnsiballZ_stat.py'
Dec 06 09:49:05 compute-1 sudo[118392]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:49:05 compute-1 python3.9[118394]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:49:05 compute-1 sudo[118392]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:05 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:49:05 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c34002470 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:49:05 compute-1 sudo[118515]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vvlszrjlkkgviqfmpwjjepjdpctvwtnw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014544.91911-801-216701076085284/AnsiballZ_copy.py'
Dec 06 09:49:05 compute-1 sudo[118515]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:49:05 compute-1 python3.9[118517]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014544.91911-801-216701076085284/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=22c202a539af259b977a1afda61dbc1fe0d1039c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:49:06 compute-1 sudo[118515]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:49:06 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c14003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:49:06 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:49:06 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:49:06 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:06.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:49:06 compute-1 sudo[118668]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mnbfatkwawlezoswsbkggfnsrgrlgezs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014546.2793214-850-122828308341981/AnsiballZ_file.py'
Dec 06 09:49:06 compute-1 sudo[118668]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:49:06 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:49:06 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:49:06 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:06.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:49:06 compute-1 python3.9[118670]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:49:06 compute-1 sudo[118671]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 09:49:06 compute-1 sudo[118671]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:49:06 compute-1 sudo[118668]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:06 compute-1 sudo[118671]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:49:07 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c4c0092f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:49:07 compute-1 ceph-mon[79770]: pgmap v228: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:49:07 compute-1 sudo[118845]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pibdaxygtqeyghrmywntuwvyoxpdbgzb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014546.8989162-876-204554807211735/AnsiballZ_stat.py'
Dec 06 09:49:07 compute-1 sudo[118845]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:49:07 compute-1 python3.9[118847]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:49:07 compute-1 sudo[118845]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:49:07 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c38003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:49:07 compute-1 sudo[118968]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wawwogglvmqbixgnhhpwcoqtjvwhbegr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014546.8989162-876-204554807211735/AnsiballZ_copy.py'
Dec 06 09:49:07 compute-1 sudo[118968]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:49:07 compute-1 python3.9[118970]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014546.8989162-876-204554807211735/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=22c202a539af259b977a1afda61dbc1fe0d1039c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:49:07 compute-1 sudo[118968]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:07 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:49:08 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:49:08 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c34002470 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:49:08 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:49:08 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:49:08 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:08.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:49:08 compute-1 sudo[119121]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dtmtijtaskgciahvmggqmnclscvyjrkn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014548.124256-917-120207996719049/AnsiballZ_file.py'
Dec 06 09:49:08 compute-1 sudo[119121]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:49:08 compute-1 python3.9[119123]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:49:08 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:49:08 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:49:08 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:08.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:49:08 compute-1 sudo[119121]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:09 compute-1 sudo[119273]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jikqmdnpaxyevzqudxkflgxieqlvqisi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014548.769744-938-263244329203227/AnsiballZ_stat.py'
Dec 06 09:49:09 compute-1 sudo[119273]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:49:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:49:09 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c14003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:49:09 compute-1 ceph-mon[79770]: pgmap v229: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:49:09 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 09:49:09 compute-1 python3.9[119275]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:49:09 compute-1 sudo[119273]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:49:09 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c4c0092f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:49:09 compute-1 sudo[119396]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ybmrnofkrlziqlnflfiqzriozakafily ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014548.769744-938-263244329203227/AnsiballZ_copy.py'
Dec 06 09:49:09 compute-1 sudo[119396]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:49:09 compute-1 python3.9[119398]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014548.769744-938-263244329203227/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=22c202a539af259b977a1afda61dbc1fe0d1039c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:49:09 compute-1 sudo[119396]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:10 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:49:10 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c38003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:49:10 compute-1 sudo[119549]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rhvucfsghkozdnsqjqarcuqimshunixt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014550.0084934-982-132754620844557/AnsiballZ_file.py'
Dec 06 09:49:10 compute-1 sudo[119549]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:49:10 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:49:10 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:49:10 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:10.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:49:10 compute-1 python3.9[119551]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:49:10 compute-1 sudo[119549]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:10 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:49:10 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:49:10 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:10.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:49:10 compute-1 sudo[119701]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hfmhjbdfpnunhrgvmtbdojatmkqidkdh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014550.656027-1007-197312393562822/AnsiballZ_stat.py'
Dec 06 09:49:10 compute-1 sudo[119701]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:49:11 compute-1 python3.9[119703]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:49:11 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:49:11 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c34003900 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:49:11 compute-1 sudo[119701]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:11 compute-1 ceph-mon[79770]: pgmap v230: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:49:11 compute-1 sudo[119824]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-byitmhfqvwyttekeggfozmdkbpiopcpp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014550.656027-1007-197312393562822/AnsiballZ_copy.py'
Dec 06 09:49:11 compute-1 sudo[119824]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:49:11 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:49:11 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c14003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:49:11 compute-1 python3.9[119826]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014550.656027-1007-197312393562822/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=22c202a539af259b977a1afda61dbc1fe0d1039c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:49:11 compute-1 sudo[119824]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:12 compute-1 sudo[119976]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jgboqwcxkybfbgkruplnduntvufaqvtl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014551.9151485-1054-75079895273140/AnsiballZ_file.py'
Dec 06 09:49:12 compute-1 sudo[119976]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:49:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[103099]: 06/12/2025 09:49:12 : epoch 6933fb9a : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f6c4c0092f0 fd 38 proxy ignored for local
Dec 06 09:49:12 compute-1 kernel: ganesha.nfsd[115953]: segfault at 50 ip 00007f6cfc73f32e sp 00007f6cc8ff8210 error 4 in libntirpc.so.5.8[7f6cfc724000+2c000] likely on CPU 1 (core 0, socket 1)
Dec 06 09:49:12 compute-1 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec 06 09:49:12 compute-1 systemd[1]: Started Process Core Dump (PID 119980/UID 0).
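The kernel segfault line is denser than it looks. "error 4" is the x86 page-fault error code, and "at 50" is the faulting address, consistent with reading through a near-NULL pointer at offset 0x50 (the disassembled instruction in the Code: dump, 45 8b 65 50, is a load from [r13+0x50]). A small decoder for the error-code bits, per the standard x86 layout:

# Decode the page-fault error code the kernel prints in "segfault ... error N".
PF_FLAGS = [
    (1 << 0, "page was present (protection fault)"),
    (1 << 1, "write access"),           # clear = read
    (1 << 2, "user mode"),              # clear = kernel mode
    (1 << 3, "reserved bit set in page table"),
    (1 << 4, "instruction fetch"),
]

def decode_pf_error(code: int) -> list[str]:
    return [desc for bit, desc in PF_FLAGS if code & bit]

# error 4: only the user-mode bit is set, so this was a user-mode *read*
# of a *non-present* page -- a plain NULL-pointer-style dereference.
print(decode_pf_error(4))  # ['user mode']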
Dec 06 09:49:12 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:49:12 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:49:12 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:12.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:49:12 compute-1 python3.9[119978]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:49:12 compute-1 sudo[119976]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:12 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:49:12 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:49:12 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:12.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:49:12 compute-1 sudo[120131]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oquknfeewleykuhoqbvfpytlhehnkffr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014552.5999053-1079-183806100396446/AnsiballZ_stat.py'
Dec 06 09:49:12 compute-1 sudo[120131]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:49:12 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:49:13 compute-1 python3.9[120133]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:49:13 compute-1 sudo[120131]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:13 compute-1 ceph-mon[79770]: pgmap v231: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:49:13 compute-1 sudo[120254]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nnprsozhrjnwwcalwehdooeqstxrnwxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014552.5999053-1079-183806100396446/AnsiballZ_copy.py'
Dec 06 09:49:13 compute-1 sudo[120254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:49:13 compute-1 python3.9[120256]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014552.5999053-1079-183806100396446/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=22c202a539af259b977a1afda61dbc1fe0d1039c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:49:13 compute-1 sudo[120254]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:13 compute-1 systemd-coredump[119981]: Process 103103 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 61:
                                                    #0  0x00007f6cfc73f32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    ELF object binary architecture: AMD x86-64
Dec 06 09:49:13 compute-1 systemd[1]: systemd-coredump@3-119980-0.service: Deactivated successfully.
Dec 06 09:49:13 compute-1 systemd[1]: systemd-coredump@3-119980-0.service: Consumed 1.523s CPU time.
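With the dump captured (process 103103, one unresolved frame in libntirpc.so.5.8 at offset 0x2232e, matching the faulting ip above), the core and its metadata can be pulled back out of the journal's coredump store with standard systemd tooling. A minimal retrieval sketch, matching by program name:

# Query the most recent ganesha.nfsd core recorded by systemd-coredump.
import subprocess

subprocess.run(["coredumpctl", "info", "ganesha.nfsd"], check=False)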
Dec 06 09:49:13 compute-1 podman[120285]: 2025-12-06 09:49:13.925897105 +0000 UTC m=+0.037514663 container died 85455d3243db1463a68bc3199c944543828c9d708094c65b0309507d9efc87ab (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0)
Dec 06 09:49:13 compute-1 systemd[1]: var-lib-containers-storage-overlay-cb68c9c4605a82178b215e3dd6a8db8454de491b3741ae6c6873e5884bb45d11-merged.mount: Deactivated successfully.
Dec 06 09:49:13 compute-1 podman[120285]: 2025-12-06 09:49:13.970101114 +0000 UTC m=+0.081718662 container remove 85455d3243db1463a68bc3199c944543828c9d708094c65b0309507d9efc87ab (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_REF=squid, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.license=GPLv2)
Dec 06 09:49:13 compute-1 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Main process exited, code=exited, status=139/n/a
Dec 06 09:49:14 compute-1 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Failed with result 'exit-code'.
Dec 06 09:49:14 compute-1 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Consumed 2.154s CPU time.
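The unit's failure record ties back to the crash: "status=139" is the conventional 128 + signal-number encoding for a process killed by SIGSEGV (11), propagated through the container wrapper. A one-line sanity check:

# 139 - 128 == 11 == SIGSEGV, matching the segfault and core dump above.
import signal

assert 139 - 128 == signal.SIGSEGV == 11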
Dec 06 09:49:14 compute-1 sshd-session[113907]: Connection closed by 192.168.122.30 port 46758
Dec 06 09:49:14 compute-1 sshd-session[113904]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:49:14 compute-1 systemd[1]: session-47.scope: Deactivated successfully.
Dec 06 09:49:14 compute-1 systemd[1]: session-47.scope: Consumed 22.841s CPU time.
Dec 06 09:49:14 compute-1 systemd-logind[788]: Session 47 logged out. Waiting for processes to exit.
Dec 06 09:49:14 compute-1 systemd-logind[788]: Removed session 47.
Dec 06 09:49:14 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:49:14 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:49:14 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:14.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:49:14 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:49:14 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:49:14 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:14.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:49:15 compute-1 ceph-mon[79770]: pgmap v232: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 09:49:16 compute-1 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #25. Immutable memtables: 0.
Dec 06 09:49:16 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:49:16.209871) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 06 09:49:16 compute-1 ceph-mon[79770]: rocksdb: [db/flush_job.cc:856] [default] [JOB 11] Flushing memtable with next log file: 25
Dec 06 09:49:16 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014556210053, "job": 11, "event": "flush_started", "num_memtables": 1, "num_entries": 1079, "num_deletes": 251, "total_data_size": 2727903, "memory_usage": 2771408, "flush_reason": "Manual Compaction"}
Dec 06 09:49:16 compute-1 ceph-mon[79770]: rocksdb: [db/flush_job.cc:885] [default] [JOB 11] Level-0 flush table #26: started
Dec 06 09:49:16 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014556226469, "cf_name": "default", "job": 11, "event": "table_file_creation", "file_number": 26, "file_size": 1766352, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 12172, "largest_seqno": 13246, "table_properties": {"data_size": 1761486, "index_size": 2454, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1349, "raw_key_size": 10182, "raw_average_key_size": 19, "raw_value_size": 1751812, "raw_average_value_size": 3299, "num_data_blocks": 109, "num_entries": 531, "num_filter_entries": 531, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014470, "oldest_key_time": 1765014470, "file_creation_time": 1765014556, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 26, "seqno_to_time_mapping": "N/A"}}
Dec 06 09:49:16 compute-1 ceph-mon[79770]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 11] Flush lasted 16664 microseconds, and 9100 cpu microseconds.
Dec 06 09:49:16 compute-1 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 09:49:16 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:49:16.226551) [db/flush_job.cc:967] [default] [JOB 11] Level-0 flush table #26: 1766352 bytes OK
Dec 06 09:49:16 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:49:16.226586) [db/memtable_list.cc:519] [default] Level-0 commit table #26 started
Dec 06 09:49:16 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:49:16.228101) [db/memtable_list.cc:722] [default] Level-0 commit table #26: memtable #1 done
Dec 06 09:49:16 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:49:16.228124) EVENT_LOG_v1 {"time_micros": 1765014556228116, "job": 11, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 06 09:49:16 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:49:16.228177) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 06 09:49:16 compute-1 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 11] Try to delete WAL files size 2722607, prev total WAL file size 2722607, number of live WAL files 2.
Dec 06 09:49:16 compute-1 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000022.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 09:49:16 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:49:16.229517) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F7300353032' seq:72057594037927935, type:22 .. '7061786F7300373534' seq:0, type:0; will stop at (end)
Dec 06 09:49:16 compute-1 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 12] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 06 09:49:16 compute-1 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 11 Base level 0, inputs: [26(1724KB)], [24(13MB)]
Dec 06 09:49:16 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014556229686, "job": 12, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [26], "files_L6": [24], "score": -1, "input_data_size": 15982283, "oldest_snapshot_seqno": -1}
Dec 06 09:49:16 compute-1 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 12] Generated table #27: 4294 keys, 14025129 bytes, temperature: kUnknown
Dec 06 09:49:16 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014556312772, "cf_name": "default", "job": 12, "event": "table_file_creation", "file_number": 27, "file_size": 14025129, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13993143, "index_size": 20164, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10757, "raw_key_size": 109889, "raw_average_key_size": 25, "raw_value_size": 13911384, "raw_average_value_size": 3239, "num_data_blocks": 852, "num_entries": 4294, "num_filter_entries": 4294, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014012, "oldest_key_time": 0, "file_creation_time": 1765014556, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 27, "seqno_to_time_mapping": "N/A"}}
Dec 06 09:49:16 compute-1 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 09:49:16 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:49:16.313229) [db/compaction/compaction_job.cc:1663] [default] [JOB 12] Compacted 1@0 + 1@6 files to L6 => 14025129 bytes
Dec 06 09:49:16 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:49:16.314632) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 192.0 rd, 168.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.7, 13.6 +0.0 blob) out(13.4 +0.0 blob), read-write-amplify(17.0) write-amplify(7.9) OK, records in: 4812, records dropped: 518 output_compression: NoCompression
Dec 06 09:49:16 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:49:16.314653) EVENT_LOG_v1 {"time_micros": 1765014556314642, "job": 12, "event": "compaction_finished", "compaction_time_micros": 83249, "compaction_time_cpu_micros": 45166, "output_level": 6, "num_output_files": 1, "total_output_size": 14025129, "num_input_records": 4812, "num_output_records": 4294, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 06 09:49:16 compute-1 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000026.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 09:49:16 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014556315048, "job": 12, "event": "table_file_deletion", "file_number": 26}
Dec 06 09:49:16 compute-1 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000024.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 09:49:16 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014556317929, "job": 12, "event": "table_file_deletion", "file_number": 24}
Dec 06 09:49:16 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:49:16.229350) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 09:49:16 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:49:16.318109) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 09:49:16 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:49:16.318118) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 09:49:16 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:49:16.318120) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 09:49:16 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:49:16.318122) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 09:49:16 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:49:16.318125) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
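The compaction summary's amplification figures can be reproduced from the byte counts in the EVENT_LOG entries for job 12: the L0 input is the freshly flushed table #26, the total input also includes the L6 file #24, and the output is table #27. A worked check:

# Numbers taken verbatim from the rocksdb EVENT_LOG lines above.
l0_in = 1_766_352        # file 26 size (the new L0 table)
total_in = 15_982_283    # "input_data_size" (L0 + L6 inputs)
out = 14_025_129         # "total_output_size"

write_amplify = out / l0_in                    # bytes written per new byte
read_write_amplify = (total_in + out) / l0_in  # bytes moved per new byte

print(f"{write_amplify:.1f}")       # ~7.9, as in the summary line
print(f"{read_write_amplify:.1f}")  # ~17.0, as in the summary line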
Dec 06 09:49:16 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:49:16 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:49:16 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:16.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:49:16 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:49:16 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:49:16 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:16.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:49:17 compute-1 ceph-mon[79770]: pgmap v233: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:49:17 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:49:18 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:49:18 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:49:18 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:18.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:49:18 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:49:18 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:49:18 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:18.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:49:19 compute-1 ceph-mon[79770]: pgmap v234: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:49:19 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/094919 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
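Here the ingress notices the crashed backend: a "Layer4 connection problem ... Connection refused" means haproxy's health check is a plain TCP connect that fails while ganesha is down between restarts. A minimal sketch of such a check (host and port are hypothetical stand-ins for the nfs.cephfs.0 backend):

# Plain TCP-connect health check, the Layer4 semantics haproxy reports on.
import socket

def l4_check(host: str, port: int, timeout: float = 2.0) -> bool:
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # e.g. "Connection refused" while the daemon restarts
        return False

print(l4_check("192.168.122.101", 2049))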
Dec 06 09:49:20 compute-1 sshd-session[120332]: Accepted publickey for zuul from 192.168.122.30 port 51690 ssh2: ECDSA SHA256:r1j7aLsKAM+XxDNbzEU5vWGpGNCOaIBwc7FZdATPttA
Dec 06 09:49:20 compute-1 systemd-logind[788]: New session 48 of user zuul.
Dec 06 09:49:20 compute-1 systemd[1]: Started Session 48 of User zuul.
Dec 06 09:49:20 compute-1 sshd-session[120332]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 06 09:49:20 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:49:20 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:49:20 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:20.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:49:20 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:49:20 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 09:49:20 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:20.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 09:49:20 compute-1 sudo[120486]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hcicpsvcioqleblxmcasykqgywehaflo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014560.2436466-27-260721343770183/AnsiballZ_file.py'
Dec 06 09:49:20 compute-1 sudo[120486]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:49:20 compute-1 python3.9[120488]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:49:20 compute-1 sudo[120486]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:21 compute-1 ceph-mon[79770]: pgmap v235: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:49:21 compute-1 sudo[120638]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-biaayajathkslqtjdmvzwtgtvlgsozeh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014561.1660461-63-193965291504767/AnsiballZ_stat.py'
Dec 06 09:49:21 compute-1 sudo[120638]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:49:21 compute-1 python3.9[120640]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:49:21 compute-1 sudo[120638]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:22 compute-1 sudo[120762]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sjbfmddrwakwykijgqadzgzecfugjgbv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014561.1660461-63-193965291504767/AnsiballZ_copy.py'
Dec 06 09:49:22 compute-1 sudo[120762]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:49:22 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:49:22 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:49:22 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:22.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:49:22 compute-1 python3.9[120764]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1765014561.1660461-63-193965291504767/.source.keyring _original_basename=ceph.client.openstack.keyring follow=False checksum=944de880f37676f80f6e04a4864888bf3f7decbf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:49:22 compute-1 sudo[120762]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:22 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:49:22 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:49:22 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:22.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:49:22 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:49:22 compute-1 sudo[120914]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kplhtzrzevbhkogviahcvxawjzojscac ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014562.7279217-63-138893949447265/AnsiballZ_stat.py'
Dec 06 09:49:22 compute-1 sudo[120914]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:49:23 compute-1 python3.9[120916]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:49:23 compute-1 sudo[120914]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:23 compute-1 ceph-mon[79770]: pgmap v236: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:49:23 compute-1 sudo[121037]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mhhouxczxjzzbsjpvylqbgwqglirabhc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014562.7279217-63-138893949447265/AnsiballZ_copy.py'
Dec 06 09:49:23 compute-1 sudo[121037]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:49:23 compute-1 python3.9[121039]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765014562.7279217-63-138893949447265/.source.conf _original_basename=ceph.conf follow=False checksum=531c84d7e2c99e4f6cf7d56dd7b16abeaf31bfa1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:49:23 compute-1 sudo[121037]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:24 compute-1 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Scheduled restart job, restart counter is at 4.
Dec 06 09:49:24 compute-1 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.djsnbu for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec 06 09:49:24 compute-1 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Consumed 2.154s CPU time.
Dec 06 09:49:24 compute-1 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.djsnbu for 5ecd3f74-dade-5fc4-92ce-8950ae424258...
Dec 06 09:49:24 compute-1 sshd-session[120335]: Connection closed by 192.168.122.30 port 51690
Dec 06 09:49:24 compute-1 sshd-session[120332]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:49:24 compute-1 systemd[1]: session-48.scope: Deactivated successfully.
Dec 06 09:49:24 compute-1 systemd[1]: session-48.scope: Consumed 2.806s CPU time.
Dec 06 09:49:24 compute-1 systemd-logind[788]: Session 48 logged out. Waiting for processes to exit.
Dec 06 09:49:24 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 09:49:24 compute-1 systemd-logind[788]: Removed session 48.
Dec 06 09:49:24 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:49:24 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:49:24 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:24.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:49:24 compute-1 podman[121110]: 2025-12-06 09:49:24.410599064 +0000 UTC m=+0.046797864 container create b9faa74dd3edb0bb8be8b8cf42ea2f255f223a99a2bba098a5ab376aa85c70c2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1)
Dec 06 09:49:24 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f02a73c780017322d2fb31621bfb5c4ae34d571bdad46ee9ef8c108294e66d28/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec 06 09:49:24 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f02a73c780017322d2fb31621bfb5c4ae34d571bdad46ee9ef8c108294e66d28/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 06 09:49:24 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f02a73c780017322d2fb31621bfb5c4ae34d571bdad46ee9ef8c108294e66d28/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec 06 09:49:24 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f02a73c780017322d2fb31621bfb5c4ae34d571bdad46ee9ef8c108294e66d28/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.djsnbu-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec 06 09:49:24 compute-1 podman[121110]: 2025-12-06 09:49:24.474200265 +0000 UTC m=+0.110399095 container init b9faa74dd3edb0bb8be8b8cf42ea2f255f223a99a2bba098a5ab376aa85c70c2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, ceph=True, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 06 09:49:24 compute-1 podman[121110]: 2025-12-06 09:49:24.479605459 +0000 UTC m=+0.115804259 container start b9faa74dd3edb0bb8be8b8cf42ea2f255f223a99a2bba098a5ab376aa85c70c2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 06 09:49:24 compute-1 bash[121110]: b9faa74dd3edb0bb8be8b8cf42ea2f255f223a99a2bba098a5ab376aa85c70c2
Dec 06 09:49:24 compute-1 podman[121110]: 2025-12-06 09:49:24.391644993 +0000 UTC m=+0.027843793 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 06 09:49:24 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:24 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec 06 09:49:24 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:24 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec 06 09:49:24 compute-1 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.djsnbu for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec 06 09:49:24 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:24 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec 06 09:49:24 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:24 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec 06 09:49:24 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:24 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec 06 09:49:24 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:24 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec 06 09:49:24 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:24 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec 06 09:49:24 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:24 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 09:49:24 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:49:24 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:49:24 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:24.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:49:25 compute-1 ceph-mon[79770]: pgmap v237: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 425 B/s rd, 0 op/s
Dec 06 09:49:26 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:49:26 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:49:26 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:26.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:49:26 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:49:26 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:49:26 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:26.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:49:26 compute-1 sudo[121168]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 09:49:26 compute-1 sudo[121168]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:49:26 compute-1 sudo[121168]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:27 compute-1 ceph-mon[79770]: pgmap v238: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 06 09:49:27 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:49:28 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:49:28 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:49:28 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:28.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:49:28 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:49:28 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:49:28 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:28.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:49:29 compute-1 ceph-mon[79770]: pgmap v239: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 596 B/s wr, 1 op/s
Dec 06 09:49:30 compute-1 sshd-session[121194]: Accepted publickey for zuul from 192.168.122.30 port 39444 ssh2: ECDSA SHA256:r1j7aLsKAM+XxDNbzEU5vWGpGNCOaIBwc7FZdATPttA
Dec 06 09:49:30 compute-1 systemd-logind[788]: New session 49 of user zuul.
Dec 06 09:49:30 compute-1 systemd[1]: Started Session 49 of User zuul.
Dec 06 09:49:30 compute-1 sshd-session[121194]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 06 09:49:30 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:49:30 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:49:30 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:30.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:49:30 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:30 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 09:49:30 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:30 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 09:49:30 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:49:30 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:49:30 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:30.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:49:31 compute-1 python3.9[121348]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:49:31 compute-1 ceph-mon[79770]: pgmap v240: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 596 B/s wr, 1 op/s
Dec 06 09:49:32 compute-1 sudo[121502]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzemcztkmganqbfntizhbkbrsqphpxvh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014571.6933777-63-53223516504239/AnsiballZ_file.py'
Dec 06 09:49:32 compute-1 sudo[121502]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:49:32 compute-1 python3.9[121504]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:49:32 compute-1 sudo[121502]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:32 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:49:32 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:49:32 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:32.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:49:32 compute-1 ceph-mon[79770]: pgmap v241: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 596 B/s wr, 1 op/s
Dec 06 09:49:32 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:49:32 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 09:49:32 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:32.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 09:49:32 compute-1 sudo[121655]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bazthdscvtnrlgfumideueyfxsqtvwiv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014572.4996753-63-38750114762861/AnsiballZ_file.py'
Dec 06 09:49:32 compute-1 sudo[121655]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:49:32 compute-1 python3.9[121657]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:49:32 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:49:32 compute-1 sudo[121655]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:33 compute-1 python3.9[121807]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:49:34 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:49:34 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:49:34 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:34.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:49:34 compute-1 sudo[121958]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xkoqaxjarcpgkljkmlvlbgpeqwapzdgs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014574.0872316-132-247909775588620/AnsiballZ_seboolean.py'
Dec 06 09:49:34 compute-1 sudo[121958]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:49:34 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:49:34 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:49:34 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:34.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:49:34 compute-1 python3.9[121960]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Dec 06 09:49:35 compute-1 ceph-mon[79770]: pgmap v242: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.3 KiB/s rd, 937 B/s wr, 3 op/s
Dec 06 09:49:35 compute-1 sudo[121958]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:35 compute-1 ceph-osd[77465]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 09:49:35 compute-1 ceph-osd[77465]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Cumulative writes: 7179 writes, 30K keys, 7179 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.03 MB/s
                                           Cumulative WAL: 7179 writes, 1333 syncs, 5.39 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 7179 writes, 30K keys, 7179 commit groups, 1.0 writes per commit group, ingest: 20.58 MB, 0.03 MB/s
                                           Interval WAL: 7179 writes, 1333 syncs, 5.39 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.013       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.013       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.01              0.00         1    0.013       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fb227cf350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 0.000158 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fb227cf350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 0.000158 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fb227cf350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 0.000158 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fb227cf350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 0.000158 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fb227cf350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 0.000158 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fb227cf350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 0.000158 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fb227cf350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 0.000158 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fb227ce9b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fb227ce9b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fb227ce9b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fb227cf350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 0.000158 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fb227cf350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 0.000158 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
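
The dump above is RocksDB's periodic statistics for the ceph-mon store's [L] and [P] column families; with no compactions in the 600 s interval, almost every counter is zero. The block-cache "portion" figures are simply entry size divided by cache capacity, and the occupancy value 18446744073709551615 is 2^64-1, which looks like an unsigned sentinel or underflow rather than a real entry count. A minimal Python check of the logged DataBlock portion (GiB/KiB units assumed):

capacity = 1.12 * 1024**3   # "capacity: 1.12 GB" from the dump above, GiB assumed
datablocks = 1.42 * 1024    # "DataBlock(3,1.42 KB,...)", KiB assumed
print(f"{datablocks / capacity * 100:.9f}%")  # ~0.000120886%, matching the logged 0.000120534% up to rounding
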
Dec 06 09:49:36 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:49:36 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:49:36 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:36.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:49:36 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:49:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:36 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec 06 09:49:36 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:49:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:36 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec 06 09:49:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:36 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec 06 09:49:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:36 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec 06 09:49:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:36 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec 06 09:49:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:36 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec 06 09:49:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:36 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec 06 09:49:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:36 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 06 09:49:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:36 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 06 09:49:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:36 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 06 09:49:36 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:36.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:49:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:36 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec 06 09:49:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:36 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 06 09:49:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:36 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec 06 09:49:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:36 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec 06 09:49:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:36 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec 06 09:49:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:36 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec 06 09:49:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:36 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec 06 09:49:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:36 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec 06 09:49:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:36 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec 06 09:49:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:36 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec 06 09:49:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:36 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec 06 09:49:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:36 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec 06 09:49:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:36 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec 06 09:49:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:36 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec 06 09:49:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:36 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec 06 09:49:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:36 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec 06 09:49:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:36 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec 06 09:49:36 compute-1 sudo[122127]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxgjkzmmxlbyuplaeygcrctnpcqqscge ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014576.5035217-162-270357927047641/AnsiballZ_setup.py'
Dec 06 09:49:36 compute-1 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Dec 06 09:49:36 compute-1 sudo[122127]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:49:37 compute-1 python3.9[122129]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 06 09:49:37 compute-1 ceph-mon[79770]: pgmap v243: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 938 B/s wr, 3 op/s
Dec 06 09:49:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:37 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:49:37 compute-1 sudo[122127]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:37 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f58001970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:49:37 compute-1 sudo[122214]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-naybozoxztpvxpbdiyosrirbxdzxbfcj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014576.5035217-162-270357927047641/AnsiballZ_dnf.py'
Dec 06 09:49:37 compute-1 sudo[122214]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:49:37 compute-1 python3.9[122216]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 06 09:49:37 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:49:38 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:38 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f48000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:49:38 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:49:38 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:49:38 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:38.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:49:38 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:49:38 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:49:38 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:38.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:49:39 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:39 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f40000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:49:39 compute-1 ceph-mon[79770]: pgmap v244: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.4 KiB/s rd, 1023 B/s wr, 3 op/s
Dec 06 09:49:39 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 09:49:39 compute-1 sudo[122214]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:39 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:39 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c001bd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:49:39 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/094939 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec 06 09:49:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:40 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f58002290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:49:40 compute-1 sudo[122369]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tdndrocmcdtwpzsmwljlivwgmwixqeeb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014579.6913033-198-97961550451692/AnsiballZ_systemd.py'
Dec 06 09:49:40 compute-1 sudo[122369]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:49:40 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:49:40 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:49:40 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:40.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:49:40 compute-1 python3.9[122371]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 06 09:49:40 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:49:40 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:49:40 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:40.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:49:40 compute-1 sudo[122369]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:41 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:41 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f480016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:49:41 compute-1 ceph-mon[79770]: pgmap v245: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 426 B/s wr, 1 op/s
Dec 06 09:49:41 compute-1 sudo[122524]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uctyetupgstktdrubiyafwxhwxnpjslk ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765014580.9870415-222-137773720268687/AnsiballZ_edpm_nftables_snippet.py'
Dec 06 09:49:41 compute-1 sudo[122524]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:49:41 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:41 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f400016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:49:41 compute-1 python3[122526]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks
                                             rule:
                                               proto: udp
                                               dport: 4789
                                           - rule_name: 119 neutron geneve networks
                                             rule:
                                               proto: udp
                                               dport: 6081
                                               state: ["UNTRACKED"]
                                           - rule_name: 120 neutron geneve networks no conntrack
                                             rule:
                                               proto: udp
                                               dport: 6081
                                               table: raw
                                               chain: OUTPUT
                                               jump: NOTRACK
                                               action: append
                                               state: []
                                           - rule_name: 121 neutron geneve networks no conntrack
                                             rule:
                                               proto: udp
                                               dport: 6081
                                               table: raw
                                               chain: PREROUTING
                                               jump: NOTRACK
                                               action: append
                                               state: []
                                            dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
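
The osp.edpm.edpm_nftables_snippet call above writes the logged YAML rule list to /var/lib/edpm-config/firewall/ovn.yaml; the edpm_nftables_from_files task further down (09:49:48) then compiles every snippet in that directory into the edpm-*.nft files. As a rough, assumed rendering only — the real templates in the osp.edpm collection emit richer output, and the default table/chain names below are guesses — the entries translate to nft statements along these lines (entry 121 is just the PREROUTING twin of 120):

import yaml  # PyYAML; the snippet is copied verbatim from the log line above

SNIPPET = """
- rule_name: 118 neutron vxlan networks
  rule:
    proto: udp
    dport: 4789
- rule_name: 119 neutron geneve networks
  rule:
    proto: udp
    dport: 6081
    state: ["UNTRACKED"]
- rule_name: 120 neutron geneve networks no conntrack
  rule:
    proto: udp
    dport: 6081
    table: raw
    chain: OUTPUT
    jump: NOTRACK
    action: append
    state: []
"""

for entry in yaml.safe_load(SNIPPET):
    r = entry["rule"]
    table = r.get("table", "filter")      # assumed default table
    chain = r.get("chain", "EDPM_INPUT")  # assumed default chain name
    verdict = "notrack" if r.get("jump") == "NOTRACK" else "accept"
    print(f'# {entry["rule_name"]}')
    print(f'add rule inet {table} {chain} {r["proto"]} dport {r["dport"]} {verdict}')
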
Dec 06 09:49:41 compute-1 sudo[122524]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:42 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c001bd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:49:42 compute-1 sudo[122677]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xzfbkrmfruyhbwjmkdrpeqlscproanst ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014582.0580914-249-183272133789231/AnsiballZ_file.py'
Dec 06 09:49:42 compute-1 sudo[122677]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:49:42 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:49:42 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:49:42 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:42.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:49:42 compute-1 python3.9[122679]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:49:42 compute-1 sudo[122677]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:42 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:49:42 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:49:42 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:42.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:49:42 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:49:43 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:43 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f58002290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:49:43 compute-1 sudo[122829]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-egyktvmqmugzzvpqkeuaogifncskdaoh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014582.736789-273-10464986386857/AnsiballZ_stat.py'
Dec 06 09:49:43 compute-1 sudo[122829]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:49:43 compute-1 ceph-mon[79770]: pgmap v246: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 426 B/s wr, 1 op/s
Dec 06 09:49:43 compute-1 python3.9[122831]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:49:43 compute-1 sudo[122829]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:43 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:43 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f480016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:49:43 compute-1 sudo[122907]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjjhwgclohnacggzlvakvbnryfzqmrsj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014582.736789-273-10464986386857/AnsiballZ_file.py'
Dec 06 09:49:43 compute-1 sudo[122907]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:49:43 compute-1 python3.9[122909]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:49:43 compute-1 sudo[122907]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:44 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:44 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f400016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:49:44 compute-1 sudo[123060]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-orgaggfgxxcdjiaucuydejxdzgryapxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014584.0247953-309-254031906232860/AnsiballZ_stat.py'
Dec 06 09:49:44 compute-1 sudo[123060]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:49:44 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:49:44 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:49:44 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:44.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:49:44 compute-1 python3.9[123062]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:49:44 compute-1 sudo[123060]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:44 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:49:44 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:49:44 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:44.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:49:44 compute-1 sudo[123138]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oobiidfmshkzjxbetcgnekqybeqxdyee ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014584.0247953-309-254031906232860/AnsiballZ_file.py'
Dec 06 09:49:44 compute-1 sudo[123138]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:49:45 compute-1 python3.9[123140]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.os1jjdd_ recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:49:45 compute-1 sudo[123138]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:45 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:45 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c0089d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:49:45 compute-1 ceph-mon[79770]: pgmap v247: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 426 B/s wr, 1 op/s
Dec 06 09:49:45 compute-1 sudo[123290]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iyymluzkuuwjvaxxazzxwnqhifqsygbz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014585.2226975-345-96404405941098/AnsiballZ_stat.py'
Dec 06 09:49:45 compute-1 sudo[123290]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:49:45 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:45 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f58002290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:49:45 compute-1 python3.9[123292]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:49:45 compute-1 sudo[123290]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:46 compute-1 sudo[123368]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gpxmaatmnkjggjaxcbojvtpxhzazflgw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014585.2226975-345-96404405941098/AnsiballZ_file.py'
Dec 06 09:49:46 compute-1 sudo[123368]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:49:46 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:46 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f480016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:49:46 compute-1 python3.9[123370]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:49:46 compute-1 sudo[123368]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:46 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:49:46 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:49:46 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:46.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:49:46 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:49:46 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:49:46 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:46.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:49:46 compute-1 sudo[123479]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 09:49:46 compute-1 sudo[123479]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:49:46 compute-1 sudo[123479]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:46 compute-1 sudo[123546]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aqgicdxjlxbcdtxugvyrjuqjdmaohgug ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014586.5494034-384-145379773731189/AnsiballZ_command.py'
Dec 06 09:49:46 compute-1 sudo[123546]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:49:47 compute-1 python3.9[123548]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:49:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:47 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f400016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:49:47 compute-1 sudo[123546]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:47 compute-1 ceph-mon[79770]: pgmap v248: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Dec 06 09:49:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:47 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c0089d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:49:47 compute-1 sudo[123699]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rziclrtapzfxayyaoptsmnqfajjwfeia ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765014587.361253-408-85263686347110/AnsiballZ_edpm_nftables_from_files.py'
Dec 06 09:49:47 compute-1 sudo[123699]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:49:47 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:49:48 compute-1 python3[123701]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec 06 09:49:48 compute-1 sudo[123699]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:48 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:48 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f58002290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:49:48 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:49:48 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:49:48 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:48.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:49:48 compute-1 sudo[123852]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fmludpvffknzjbrfcpdoqepxraestfxv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014588.2467675-432-65815724871084/AnsiballZ_stat.py'
Dec 06 09:49:48 compute-1 sudo[123852]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:49:48 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:49:48 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:49:48 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:48.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:49:48 compute-1 python3.9[123854]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:49:48 compute-1 sudo[123852]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:49 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:49 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f48002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:49:49 compute-1 sudo[123977]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kcsdwilbyxfusgreiyrhbdkoclkezmit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014588.2467675-432-65815724871084/AnsiballZ_copy.py'
Dec 06 09:49:49 compute-1 sudo[123977]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:49:49 compute-1 ceph-mon[79770]: pgmap v249: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 85 B/s wr, 0 op/s
Dec 06 09:49:49 compute-1 python3.9[123979]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014588.2467675-432-65815724871084/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:49:49 compute-1 sudo[123977]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:49 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:49 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f40002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:49:50 compute-1 sudo[124129]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yunknajokugmkvjsouakmxlilpoubnyp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014589.6480289-477-13677852418489/AnsiballZ_stat.py'
Dec 06 09:49:50 compute-1 sudo[124129]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:49:50 compute-1 python3.9[124131]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:49:50 compute-1 sudo[124129]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:50 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:50 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c0096e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:49:50 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:49:50 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:49:50 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:50.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:49:50 compute-1 sudo[124255]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jcdczyyehwonnojonwljrayogtspiyvk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014589.6480289-477-13677852418489/AnsiballZ_copy.py'
Dec 06 09:49:50 compute-1 sudo[124255]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:49:50 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:49:50 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:49:50 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:50.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:49:50 compute-1 python3.9[124257]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014589.6480289-477-13677852418489/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:49:50 compute-1 sudo[124255]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:51 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:51 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f58002290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:49:51 compute-1 ceph-mon[79770]: pgmap v250: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:49:51 compute-1 sudo[124407]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qvqrcqcrfxurdhmbdqlyxclilbqhsjpy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014591.054216-522-66352359629040/AnsiballZ_stat.py'
Dec 06 09:49:51 compute-1 sudo[124407]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:49:51 compute-1 python3.9[124409]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:49:51 compute-1 sudo[124407]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:51 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:51 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f48002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:49:51 compute-1 sudo[124532]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nunvfqprawiokewtvscrowofzeejhadm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014591.054216-522-66352359629040/AnsiballZ_copy.py'
Dec 06 09:49:51 compute-1 sudo[124532]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:49:52 compute-1 python3.9[124534]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014591.054216-522-66352359629040/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:49:52 compute-1 sudo[124532]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:52 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f40002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:49:52 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:49:52 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:49:52 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:52.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:49:52 compute-1 sudo[124685]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jfnwhbgpxiycltqubharlslcnmowjzlj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014592.3433285-567-229076604709049/AnsiballZ_stat.py'
Dec 06 09:49:52 compute-1 sudo[124685]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:49:52 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:49:52 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:49:52 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:52.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:49:52 compute-1 python3.9[124687]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:49:52 compute-1 sudo[124685]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:52 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:49:53 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:53 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c0096e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:49:53 compute-1 sudo[124810]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wtngunnnqaytjsjbjlbmoyzabwsxzrob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014592.3433285-567-229076604709049/AnsiballZ_copy.py'
Dec 06 09:49:53 compute-1 sudo[124810]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:49:53 compute-1 ceph-mon[79770]: pgmap v251: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:49:53 compute-1 python3.9[124812]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014592.3433285-567-229076604709049/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:49:53 compute-1 sudo[124810]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:53 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:53 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f58002290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:49:54 compute-1 sudo[124962]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qreamzyalaozxoabqmzcifbvlodktmwn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014593.6718242-612-54795851242859/AnsiballZ_stat.py'
Dec 06 09:49:54 compute-1 sudo[124962]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:49:54 compute-1 python3.9[124964]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:49:54 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:54 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f48002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:49:54 compute-1 sudo[124962]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:54 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 09:49:54 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:49:54 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:49:54 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:54.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:49:54 compute-1 sudo[125088]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-stxuwecjgfzlcvzkwcorrmysictronnm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014593.6718242-612-54795851242859/AnsiballZ_copy.py'
Dec 06 09:49:54 compute-1 sudo[125088]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:49:54 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:49:54 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:49:54 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:54.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:49:54 compute-1 python3.9[125090]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014593.6718242-612-54795851242859/.source.nft follow=False _original_basename=ruleset.j2 checksum=bdba38546f86123f1927359d89789bd211aba99d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:49:54 compute-1 sudo[125088]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:55 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:55 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f48002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:49:55 compute-1 ceph-mon[79770]: pgmap v252: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:49:55 compute-1 sudo[125240]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kgqtvdwwfrdysdhcxoxunchjfiaweoor ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014595.1325188-657-94469277379460/AnsiballZ_file.py'
Dec 06 09:49:55 compute-1 sudo[125240]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:49:55 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:55 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c0096e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:49:55 compute-1 python3.9[125242]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:49:55 compute-1 sudo[125240]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:56 compute-1 sudo[125392]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cdvfecifnntyprlfwsamfufczptkhkzj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014595.889465-681-232086286225212/AnsiballZ_command.py'
Dec 06 09:49:56 compute-1 sudo[125392]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:49:56 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:56 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c0096e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:49:56 compute-1 python3.9[125395]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:49:56 compute-1 sudo[125392]: pam_unix(sudo:session): session closed for user root
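
The command just above is the role's dry run: the five edpm-*.nft files are concatenated in dependency order (chains first, since the rules reference them) and piped to nft -c -f -, which parses and validates the combined ruleset without committing anything; only edpm-chains.nft is actually loaded a moment later (the nft -f /etc/nftables/edpm-chains.nft task at 09:49:58 below). A sketch of the same validation step in Python:

import subprocess

# Same load order as the logged command; edpm-rules.nft refers to
# chains declared in edpm-chains.nft, so order matters.
FILES = [
    "/etc/nftables/edpm-chains.nft",
    "/etc/nftables/edpm-flushes.nft",
    "/etc/nftables/edpm-rules.nft",
    "/etc/nftables/edpm-update-jumps.nft",
    "/etc/nftables/edpm-jumps.nft",
]

ruleset = "".join(open(path).read() for path in FILES)
# -c: check/parse only, do not commit; -f -: read the ruleset from stdin.
subprocess.run(["nft", "-c", "-f", "-"], input=ruleset, text=True, check=True)
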
Dec 06 09:49:56 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:49:56 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:49:56 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:56.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:49:56 compute-1 ceph-mon[79770]: pgmap v253: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:49:56 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:49:56 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:49:56 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:56.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:49:57 compute-1 sudo[125548]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yztbdtooerrnomamaqnrsfskhdlypqmd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014596.6922946-705-163298386157497/AnsiballZ_blockinfile.py'
Dec 06 09:49:57 compute-1 sudo[125548]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:49:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:57 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c0096e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:49:57 compute-1 python3.9[125550]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
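
The blockinfile task above maintains an Ansible-managed block in /etc/sysconfig/nftables.conf, validating the edited file with nft -c -f %s before swapping it in. Reconstructed from the logged parameters (marker "# {mark} ANSIBLE MANAGED BLOCK" with BEGIN/END markers), the resulting block should read:

# BEGIN ANSIBLE MANAGED BLOCK
include "/etc/nftables/iptables.nft"
include "/etc/nftables/edpm-chains.nft"
include "/etc/nftables/edpm-rules.nft"
include "/etc/nftables/edpm-jumps.nft"
# END ANSIBLE MANAGED BLOCK

so that the nftables service, which loads /etc/sysconfig/nftables.conf on EL9, picks up the full EDPM ruleset on the next restart.
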
Dec 06 09:49:57 compute-1 sudo[125548]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:57 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c0096e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:49:57 compute-1 sudo[125606]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:49:57 compute-1 sudo[125606]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:49:57 compute-1 sudo[125606]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:57 compute-1 sudo[125652]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 06 09:49:57 compute-1 sudo[125652]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:49:57 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:49:58 compute-1 sudo[125750]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vrukpaxvloijjjctmoetgvkwnahnfmyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014597.7229261-732-249643761806057/AnsiballZ_command.py'
Dec 06 09:49:58 compute-1 sudo[125750]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:49:58 compute-1 python3.9[125752]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:49:58 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:58 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f64001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:49:58 compute-1 sudo[125750]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:58 compute-1 sudo[125652]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:58 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:49:58 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:49:58 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:49:58.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:49:58 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:49:58 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:49:58 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:49:58.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:49:58 compute-1 sudo[125935]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tudakzymzjnlwbqfpwxajwimlyncuvkd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014598.4746459-756-31470705839198/AnsiballZ_stat.py'
Dec 06 09:49:58 compute-1 sudo[125935]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:49:58 compute-1 python3.9[125937]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:49:58 compute-1 sudo[125935]: pam_unix(sudo:session): session closed for user root
Dec 06 09:49:59 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:59 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f58002290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:49:59 compute-1 ceph-mon[79770]: pgmap v254: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 09:49:59 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:49:59 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:49:59 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 09:49:59 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 06 09:49:59 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:49:59 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:49:59 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 06 09:49:59 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 06 09:49:59 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 09:49:59 compute-1 sudo[126089]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tialawgyqqehkuehlzxkxyaiwshqvxtz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014599.1843271-780-8916779496825/AnsiballZ_command.py'
Dec 06 09:49:59 compute-1 sudo[126089]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:49:59 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:49:59 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f48002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:49:59 compute-1 python3.9[126091]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:49:59 compute-1 sudo[126089]: pam_unix(sudo:session): session closed for user root
Dec 06 09:50:00 compute-1 ceph-mon[79770]: overall HEALTH_OK
Dec 06 09:50:00 compute-1 sudo[126244]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jlrjivmojehzronfvfmfdxaegorbnkzo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014599.945167-804-194203830909540/AnsiballZ_file.py'
Dec 06 09:50:00 compute-1 sudo[126244]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:50:00 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:50:00 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c0096e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:50:00 compute-1 python3.9[126247]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:50:00 compute-1 sudo[126244]: pam_unix(sudo:session): session closed for user root
Dec 06 09:50:00 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:50:00 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:50:00 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:00.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:50:00 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:50:00 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:50:00 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:00.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:50:01 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:50:01 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f640025c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:50:01 compute-1 ceph-mon[79770]: pgmap v255: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:50:01 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:50:01 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f58002290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:50:01 compute-1 python3.9[126397]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:50:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:50:02 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f48003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:50:02 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:50:02 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:50:02 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:02.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:50:02 compute-1 ceph-mon[79770]: pgmap v256: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:50:02 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:50:02 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:50:02 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:02.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:50:02 compute-1 sudo[126549]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gifhczasdbthvzotcnwlawtttqcvbwsv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014602.506113-924-267039302387973/AnsiballZ_command.py'
Dec 06 09:50:02 compute-1 sudo[126549]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:50:02 compute-1 python3.9[126551]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-1.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:1e:0a:f2:93:49:d5" external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch 
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:50:02 compute-1 ovs-vsctl[126552]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-1.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:1e:0a:f2:93:49:d5 external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Dec 06 09:50:02 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:50:02 compute-1 sudo[126549]: pam_unix(sudo:session): session closed for user root
Dec 06 09:50:03 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:50:03 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c00a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:50:03 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:50:03 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f640025c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:50:03 compute-1 sudo[126702]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-saffksxfvfhustvabdpbpwjkgppkglgu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014603.403202-951-227425897931808/AnsiballZ_command.py'
Dec 06 09:50:03 compute-1 sudo[126702]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:50:03 compute-1 python3.9[126704]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                             ovs-vsctl show | grep -q "Manager"
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:50:03 compute-1 sudo[126702]: pam_unix(sudo:session): session closed for user root
Dec 06 09:50:04 compute-1 sudo[126802]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:50:04 compute-1 sudo[126802]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:50:04 compute-1 sudo[126802]: pam_unix(sudo:session): session closed for user root
Dec 06 09:50:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:50:04 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f58002290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:50:04 compute-1 sudo[126883]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dxnduxqqugycmsgxygsugyfmevolksys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014604.089825-975-94761349486878/AnsiballZ_command.py'
Dec 06 09:50:04 compute-1 sudo[126883]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:50:04 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:50:04 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:50:04 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:04.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:50:04 compute-1 python3.9[126885]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:6640:127.0.0.1\" -- add Open_vSwitch . manager_options @manager
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:50:04 compute-1 ovs-vsctl[126886]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Dec 06 09:50:04 compute-1 sudo[126883]: pam_unix(sudo:session): session closed for user root
Dec 06 09:50:04 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:50:04 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:50:04 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:04.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:50:05 compute-1 ceph-mon[79770]: pgmap v257: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:50:05 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:50:05 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:50:05 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:50:05 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f48003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:50:05 compute-1 python3.9[127036]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:50:05 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:50:05 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c00a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:50:05 compute-1 sudo[127188]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cgmlvweynccbbneiqecdinntemgpyejh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014605.5742986-1026-25493081108787/AnsiballZ_file.py'
Dec 06 09:50:05 compute-1 sudo[127188]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:50:06 compute-1 python3.9[127190]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:50:06 compute-1 sudo[127188]: pam_unix(sudo:session): session closed for user root
Dec 06 09:50:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:50:06 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f640032d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:50:06 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:50:06 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:50:06 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:06.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:50:06 compute-1 sudo[127341]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-njvfiijqniwnmdvodaujbdxzjqgtxcfc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014606.3097057-1050-68241901158148/AnsiballZ_stat.py'
Dec 06 09:50:06 compute-1 sudo[127341]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:50:06 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:50:06 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:50:06 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:06.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:50:06 compute-1 python3.9[127343]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:50:06 compute-1 sudo[127341]: pam_unix(sudo:session): session closed for user root
Dec 06 09:50:06 compute-1 sudo[127369]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 09:50:06 compute-1 sudo[127369]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:50:06 compute-1 sudo[127369]: pam_unix(sudo:session): session closed for user root
Dec 06 09:50:07 compute-1 sudo[127444]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pdkgxpwkhlygtunrxyitlrweajuhsdls ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014606.3097057-1050-68241901158148/AnsiballZ_file.py'
Dec 06 09:50:07 compute-1 sudo[127444]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:50:07 compute-1 ceph-mon[79770]: pgmap v258: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:50:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:50:07 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f58002290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:50:07 compute-1 python3.9[127446]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:50:07 compute-1 sudo[127444]: pam_unix(sudo:session): session closed for user root
Dec 06 09:50:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:50:07 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f48003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:50:07 compute-1 sudo[127596]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-arxdlyvlfxptqcnmmlarpqdrfwlfhagr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014607.4429576-1050-17837641884619/AnsiballZ_stat.py'
Dec 06 09:50:07 compute-1 sudo[127596]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:50:07 compute-1 python3.9[127598]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:50:07 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:50:07 compute-1 sudo[127596]: pam_unix(sudo:session): session closed for user root
Dec 06 09:50:08 compute-1 sudo[127674]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qnozowhgzkayjttbbjhlluwowxjozmvl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014607.4429576-1050-17837641884619/AnsiballZ_file.py'
Dec 06 09:50:08 compute-1 sudo[127674]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:50:08 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:50:08 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c00a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:50:08 compute-1 python3.9[127677]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:50:08 compute-1 sudo[127674]: pam_unix(sudo:session): session closed for user root
Dec 06 09:50:08 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:50:08 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:50:08 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:08.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:50:08 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:50:08 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:50:08 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:08.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:50:08 compute-1 sudo[127828]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pbierkiqhbnsdurmjktjirzoaftfnylj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014608.5855541-1119-123598168359018/AnsiballZ_file.py'
Dec 06 09:50:08 compute-1 sudo[127828]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:50:09 compute-1 python3.9[127830]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:50:09 compute-1 sudo[127828]: pam_unix(sudo:session): session closed for user root
Dec 06 09:50:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:50:09 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c00a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:50:09 compute-1 ceph-mon[79770]: pgmap v259: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 09:50:09 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 09:50:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:50:09 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f3c000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:50:09 compute-1 sudo[127980]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-izazhqeniihixsmdotxtlgdrxsybrwoe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014609.3789082-1143-56700880557744/AnsiballZ_stat.py'
Dec 06 09:50:09 compute-1 sudo[127980]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:50:09 compute-1 python3.9[127982]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:50:09 compute-1 sudo[127980]: pam_unix(sudo:session): session closed for user root
Dec 06 09:50:10 compute-1 sudo[128058]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yfddlvddfllfmcafjqtebyuglzydthqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014609.3789082-1143-56700880557744/AnsiballZ_file.py'
Dec 06 09:50:10 compute-1 sudo[128058]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:50:10 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:50:10 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f48003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:50:10 compute-1 python3.9[128060]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:50:10 compute-1 sudo[128058]: pam_unix(sudo:session): session closed for user root
Dec 06 09:50:10 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:50:10 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:50:10 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:10.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:50:10 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:50:10 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:50:10 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:10.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:50:10 compute-1 sudo[128211]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wtojkjmxxuozjjnimfbmzxbehobyfefx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014610.5040002-1179-216450569606014/AnsiballZ_stat.py'
Dec 06 09:50:10 compute-1 sudo[128211]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:50:10 compute-1 python3.9[128213]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:50:10 compute-1 sudo[128211]: pam_unix(sudo:session): session closed for user root
Dec 06 09:50:11 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:50:11 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f640032d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:50:11 compute-1 ceph-mon[79770]: pgmap v260: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:50:11 compute-1 sudo[128289]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lvrsitxshevslprfneeuwgzuuhdsjjdo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014610.5040002-1179-216450569606014/AnsiballZ_file.py'
Dec 06 09:50:11 compute-1 sudo[128289]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:50:11 compute-1 python3.9[128291]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:50:11 compute-1 sudo[128289]: pam_unix(sudo:session): session closed for user root
Dec 06 09:50:11 compute-1 kernel: ganesha.nfsd[122066]: segfault at 50 ip 00007f801805232e sp 00007f7fe37fd210 error 4 in libntirpc.so.5.8[7f8018037000+2c000] likely on CPU 6 (core 0, socket 6)
Dec 06 09:50:11 compute-1 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec 06 09:50:11 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[121125]: 06/12/2025 09:50:11 : epoch 6933fc24 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f7f6c00a3f0 fd 38 proxy ignored for local
Dec 06 09:50:11 compute-1 systemd[1]: Started Process Core Dump (PID 128316/UID 0).
Dec 06 09:50:12 compute-1 sudo[128443]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pnqetjfkplctoeyhzglbdxawelazcgdz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014611.8307803-1215-35636811636146/AnsiballZ_systemd.py'
Dec 06 09:50:12 compute-1 sudo[128443]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:50:12 compute-1 python3.9[128445]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:50:12 compute-1 systemd[1]: Reloading.
Dec 06 09:50:12 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:50:12 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:50:12 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:12.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:50:12 compute-1 systemd-rc-local-generator[128472]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:50:12 compute-1 systemd-sysv-generator[128475]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:50:12 compute-1 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 09:50:12 compute-1 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Cumulative writes: 2212 writes, 13K keys, 2212 commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.06 MB/s
                                           Cumulative WAL: 2212 writes, 2212 syncs, 1.00 writes per sync, written: 0.04 GB, 0.06 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 2212 writes, 13K keys, 2212 commit groups, 1.0 writes per commit group, ingest: 38.35 MB, 0.06 MB/s
                                           Interval WAL: 2212 writes, 2212 syncs, 1.00 writes per sync, written: 0.04 GB, 0.06 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0    113.2      0.19              0.09         6    0.032       0      0       0.0       0.0
                                             L6      1/0   13.38 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   3.0    106.1     93.7      0.70              0.28         5    0.140     21K   2281       0.0       0.0
                                            Sum      1/0   13.38 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   4.0     83.0     97.9      0.89              0.37        11    0.081     21K   2281       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   4.0     83.2     98.2      0.89              0.37        10    0.089     21K   2281       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Low      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   0.0    106.1     93.7      0.70              0.28         5    0.140     21K   2281       0.0       0.0
                                           High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0    114.4      0.19              0.09         5    0.039       0      0       0.0       0.0
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.022, interval 0.022
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.09 GB write, 0.15 MB/s write, 0.07 GB read, 0.12 MB/s read, 0.9 seconds
                                           Interval compaction: 0.09 GB write, 0.15 MB/s write, 0.07 GB read, 0.12 MB/s read, 0.9 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fbbecff350#2 capacity: 304.00 MB usage: 1.62 MB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 0 last_secs: 0.000112 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(91,1.41 MB,0.464475%) FilterBlock(11,71.42 KB,0.0229434%) IndexBlock(11,143.02 KB,0.045942%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
Dec 06 09:50:12 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:50:12 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:50:12 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:12.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:50:12 compute-1 sudo[128443]: pam_unix(sudo:session): session closed for user root
Dec 06 09:50:12 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:50:13 compute-1 systemd-coredump[128317]: Process 121129 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 41:
                                                    #0  0x00007f801805232e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    ELF object binary architecture: AMD x86-64
Dec 06 09:50:13 compute-1 systemd[1]: systemd-coredump@4-128316-0.service: Deactivated successfully.
Dec 06 09:50:13 compute-1 systemd[1]: systemd-coredump@4-128316-0.service: Consumed 1.545s CPU time.
Dec 06 09:50:13 compute-1 ceph-mon[79770]: pgmap v261: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:50:13 compute-1 podman[128567]: 2025-12-06 09:50:13.318367316 +0000 UTC m=+0.037156838 container died b9faa74dd3edb0bb8be8b8cf42ea2f255f223a99a2bba098a5ab376aa85c70c2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 06 09:50:13 compute-1 systemd[1]: var-lib-containers-storage-overlay-f02a73c780017322d2fb31621bfb5c4ae34d571bdad46ee9ef8c108294e66d28-merged.mount: Deactivated successfully.
Dec 06 09:50:13 compute-1 podman[128567]: 2025-12-06 09:50:13.36296989 +0000 UTC m=+0.081759392 container remove b9faa74dd3edb0bb8be8b8cf42ea2f255f223a99a2bba098a5ab376aa85c70c2 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325)
Dec 06 09:50:13 compute-1 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Main process exited, code=exited, status=139/n/a
Dec 06 09:50:13 compute-1 sudo[128665]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jfjykomgacmqlmlbundluukfclfoxsma ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014613.1714134-1239-64509254799360/AnsiballZ_stat.py'
Dec 06 09:50:13 compute-1 sudo[128665]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:50:13 compute-1 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Failed with result 'exit-code'.
Dec 06 09:50:13 compute-1 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Consumed 1.886s CPU time.
Dec 06 09:50:13 compute-1 python3.9[128676]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:50:13 compute-1 sudo[128665]: pam_unix(sudo:session): session closed for user root
Dec 06 09:50:13 compute-1 sudo[128755]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kfjxyaetmealrboxwhaaammbcuwyzsya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014613.1714134-1239-64509254799360/AnsiballZ_file.py'
Dec 06 09:50:13 compute-1 sudo[128755]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:50:14 compute-1 python3.9[128757]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:50:14 compute-1 sudo[128755]: pam_unix(sudo:session): session closed for user root
Dec 06 09:50:14 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:50:14 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:50:14 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:14.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:50:14 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:50:14 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:50:14 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:14.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:50:14 compute-1 sudo[128908]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ridppfvhwyppyafztfufdvptuzojfnue ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014614.47685-1275-265145239752272/AnsiballZ_stat.py'
Dec 06 09:50:14 compute-1 sudo[128908]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:50:14 compute-1 python3.9[128910]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:50:14 compute-1 sudo[128908]: pam_unix(sudo:session): session closed for user root
Dec 06 09:50:15 compute-1 sudo[128986]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jaalvmonbxwhpsxrvhsjtaufvceytass ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014614.47685-1275-265145239752272/AnsiballZ_file.py'
Dec 06 09:50:15 compute-1 sudo[128986]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:50:15 compute-1 ceph-mon[79770]: pgmap v262: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:50:15 compute-1 python3.9[128988]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:50:15 compute-1 sudo[128986]: pam_unix(sudo:session): session closed for user root
Dec 06 09:50:15 compute-1 sudo[129138]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-axektiiyykwvcmqyrpffdwblsbsoouwp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014615.6299083-1311-268345404098590/AnsiballZ_systemd.py'
Dec 06 09:50:15 compute-1 sudo[129138]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:50:16 compute-1 python3.9[129140]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:50:16 compute-1 systemd[1]: Reloading.
Dec 06 09:50:16 compute-1 systemd-sysv-generator[129171]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:50:16 compute-1 systemd-rc-local-generator[129167]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:50:16 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:50:16 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:50:16 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:16.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:50:16 compute-1 systemd[1]: Starting Create netns directory...
Dec 06 09:50:16 compute-1 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 06 09:50:16 compute-1 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 06 09:50:16 compute-1 systemd[1]: Finished Create netns directory.
Dec 06 09:50:16 compute-1 sudo[129138]: pam_unix(sudo:session): session closed for user root
Dec 06 09:50:16 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:50:16 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:50:16 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:16.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:50:17 compute-1 sudo[129331]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gufgeojyajgcejfuuqhcdocshwceqwiq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014616.898436-1341-44651452373101/AnsiballZ_file.py'
Dec 06 09:50:17 compute-1 sudo[129331]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:50:17 compute-1 ceph-mon[79770]: pgmap v263: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:50:17 compute-1 python3.9[129333]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:50:17 compute-1 sudo[129331]: pam_unix(sudo:session): session closed for user root
Dec 06 09:50:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/095017 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec 06 09:50:17 compute-1 sudo[129483]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mzpxgryztpioeuktiscomkbbinccosnz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014617.5573714-1365-189790454814158/AnsiballZ_stat.py'
Dec 06 09:50:17 compute-1 sudo[129483]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:50:17 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:50:18 compute-1 python3.9[129485]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:50:18 compute-1 sudo[129483]: pam_unix(sudo:session): session closed for user root
Dec 06 09:50:18 compute-1 sudo[129607]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-grsaysjrwtuvpejiqghqlrjfrswaouur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014617.5573714-1365-189790454814158/AnsiballZ_copy.py'
Dec 06 09:50:18 compute-1 sudo[129607]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:50:18 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:50:18 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:50:18 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:18.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:50:18 compute-1 python3.9[129609]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014617.5573714-1365-189790454814158/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:50:18 compute-1 sudo[129607]: pam_unix(sudo:session): session closed for user root
Dec 06 09:50:18 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:50:18 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:50:18 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:18.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:50:19 compute-1 sudo[129759]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-skuzzmkszbtttkexnluhnqppqnnbzubx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014619.0414326-1416-208747937725000/AnsiballZ_file.py'
Dec 06 09:50:19 compute-1 sudo[129759]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:50:19 compute-1 ceph-mon[79770]: pgmap v264: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Dec 06 09:50:19 compute-1 python3.9[129761]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:50:19 compute-1 sudo[129759]: pam_unix(sudo:session): session closed for user root
Dec 06 09:50:19 compute-1 sudo[129911]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bqeaiqpdohwnxjlboaubnvtekuqaxxaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014619.7168067-1440-214802979455028/AnsiballZ_stat.py'
Dec 06 09:50:19 compute-1 sudo[129911]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:50:20 compute-1 python3.9[129913]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:50:20 compute-1 sudo[129911]: pam_unix(sudo:session): session closed for user root
Dec 06 09:50:20 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:50:20 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:50:20 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:20.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:50:20 compute-1 sudo[130035]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ejnnezketwrerkxnfvmjukznlhfgqqoj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014619.7168067-1440-214802979455028/AnsiballZ_copy.py'
Dec 06 09:50:20 compute-1 sudo[130035]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:50:20 compute-1 python3.9[130037]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1765014619.7168067-1440-214802979455028/.source.json _original_basename=.zws48ewl follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:50:20 compute-1 sudo[130035]: pam_unix(sudo:session): session closed for user root
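[Note: the ovn_controller.json written here is the kolla start-up contract that kolla_set_configs reads inside the container later in this log (the INFO:__main__ lines around 09:50:38). Its actual content is masked (content=NOT_LOGGING_PARAMETER), so the following is only a hypothetical minimal shape based on the documented kolla config format, not the file that was deployed.

    # Hypothetical minimal kolla config.json; the real content is masked
    # in the log, so keys and values here are illustrative assumptions.
    import json

    config = {
        # the real command appears later in the log on the /run_command lines
        "command": "/usr/bin/ovn-controller ...",
        "config_files": [],   # files for kolla_set_configs to copy under COPY_ALWAYS
        "permissions": [],    # optional ownership/permission fixups
    }
    print(json.dumps(config, indent=2))
]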
Dec 06 09:50:20 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:50:20 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:50:20 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:20.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:50:21 compute-1 sudo[130187]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-borrvfxdqhpsilxpoajaopwzobbvdvic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014620.9442763-1485-45310280441222/AnsiballZ_file.py'
Dec 06 09:50:21 compute-1 sudo[130187]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:50:21 compute-1 ceph-mon[79770]: pgmap v265: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 06 09:50:21 compute-1 python3.9[130189]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:50:21 compute-1 sudo[130187]: pam_unix(sudo:session): session closed for user root
Dec 06 09:50:21 compute-1 sudo[130339]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fkrrjhyfmghlgufyaoqsljufomwtpftx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014621.6710157-1509-13504299740946/AnsiballZ_stat.py'
Dec 06 09:50:21 compute-1 sudo[130339]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:50:22 compute-1 sudo[130339]: pam_unix(sudo:session): session closed for user root
Dec 06 09:50:22 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:50:22 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:50:22 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:22.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:50:22 compute-1 sudo[130463]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yxvjercxzyxhldabthjhtqdmvdoproaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014621.6710157-1509-13504299740946/AnsiballZ_copy.py'
Dec 06 09:50:22 compute-1 sudo[130463]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:50:22 compute-1 sudo[130463]: pam_unix(sudo:session): session closed for user root
Dec 06 09:50:22 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:50:22 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:50:22 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:22.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:50:22 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:50:23 compute-1 ceph-mon[79770]: pgmap v266: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 06 09:50:23 compute-1 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Scheduled restart job, restart counter is at 5.
Dec 06 09:50:23 compute-1 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.djsnbu for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec 06 09:50:23 compute-1 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Consumed 1.886s CPU time.
Dec 06 09:50:23 compute-1 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.djsnbu for 5ecd3f74-dade-5fc4-92ce-8950ae424258...
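[Note: the stop/start sequence above closes the loop on the earlier haproxy "Server backend/nfs.cephfs.0 is DOWN" event: systemd is on its fifth scheduled restart of the nfs-ganesha unit. The counter systemd consults is its NRestarts property; a quick way to read it for the exact unit named in this log, assuming systemctl is available on the host:

    # Read systemd's restart counter for the nfs unit from this log.
    import subprocess

    unit = "ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service"
    subprocess.run(["systemctl", "show", "-p", "NRestarts", unit], check=False)
]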
Dec 06 09:50:23 compute-1 sudo[130628]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qdaztsukqzmwwkiuebsnurolxjcnorfj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014623.1183193-1560-79946849246308/AnsiballZ_container_config_data.py'
Dec 06 09:50:23 compute-1 sudo[130628]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:50:23 compute-1 podman[130665]: 2025-12-06 09:50:23.778924366 +0000 UTC m=+0.045535468 container create aeb4a191b30e3d0e639fe714012cb8167b13d0245e7a274e7aa6d996a80dbf01 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, OSD_FLAVOR=default, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 09:50:23 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/40002f69ba8adcfb87c67e5821ee91e412b0c3574a69c61d93dc56a081e3f1b8/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec 06 09:50:23 compute-1 python3.9[130636]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Dec 06 09:50:23 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/40002f69ba8adcfb87c67e5821ee91e412b0c3574a69c61d93dc56a081e3f1b8/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 06 09:50:23 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/40002f69ba8adcfb87c67e5821ee91e412b0c3574a69c61d93dc56a081e3f1b8/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec 06 09:50:23 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/40002f69ba8adcfb87c67e5821ee91e412b0c3574a69c61d93dc56a081e3f1b8/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.djsnbu-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec 06 09:50:23 compute-1 sudo[130628]: pam_unix(sudo:session): session closed for user root
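[Note: ansible-container_config_data, invoked above with config_path=/var/lib/edpm-config/container-startup-config/ovn_controller and config_pattern=*.json, gathers the per-container start-up definitions for the next step. Roughly, it amounts to a glob-and-parse pass; the return shape below is an assumption, not the module source.

    # Approximation of the collect step: parse every *.json under the
    # startup-config directory into a name -> dict mapping.
    import glob, json, os

    config_path = "/var/lib/edpm-config/container-startup-config/ovn_controller"
    configs = {}
    for path in sorted(glob.glob(os.path.join(config_path, "*.json"))):
        with open(path) as f:
            configs[os.path.basename(path)] = json.load(f)
    print(list(configs))
]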
Dec 06 09:50:23 compute-1 podman[130665]: 2025-12-06 09:50:23.758740848 +0000 UTC m=+0.025351990 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 06 09:50:23 compute-1 podman[130665]: 2025-12-06 09:50:23.861955329 +0000 UTC m=+0.128566461 container init aeb4a191b30e3d0e639fe714012cb8167b13d0245e7a274e7aa6d996a80dbf01 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, OSD_FLAVOR=default, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 06 09:50:23 compute-1 podman[130665]: 2025-12-06 09:50:23.867714034 +0000 UTC m=+0.134325146 container start aeb4a191b30e3d0e639fe714012cb8167b13d0245e7a274e7aa6d996a80dbf01 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, io.buildah.version=1.40.1, OSD_FLAVOR=default)
Dec 06 09:50:23 compute-1 bash[130665]: aeb4a191b30e3d0e639fe714012cb8167b13d0245e7a274e7aa6d996a80dbf01
Dec 06 09:50:23 compute-1 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.djsnbu for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec 06 09:50:23 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:23 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec 06 09:50:23 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:23 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec 06 09:50:23 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:23 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec 06 09:50:23 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:23 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec 06 09:50:23 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:23 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec 06 09:50:23 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:23 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec 06 09:50:23 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:23 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec 06 09:50:23 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:23 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 09:50:24 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 09:50:24 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:50:24 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:50:24 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:24.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:50:24 compute-1 sudo[130873]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qzfbjlpoodkqbaefnoxmtdtihpsbzpft ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014624.168061-1587-215583143321099/AnsiballZ_container_config_hash.py'
Dec 06 09:50:24 compute-1 sudo[130873]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:50:24 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:50:24 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:50:24 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:24.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:50:24 compute-1 python3.9[130875]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 06 09:50:24 compute-1 sudo[130873]: pam_unix(sudo:session): session closed for user root
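[Note: ansible-container_config_hash walks the config volumes under config_vol_prefix=/var/lib/config-data and derives a digest per volume, so a changed config file surfaces as a changed hash and can trigger a container restart. The exact hashing scheme is internal to the module; this sketch only illustrates the idea.

    # One digest per config volume; any file change changes the digest.
    # The hash construction here is an assumption, not the module's scheme.
    import hashlib, os

    prefix = "/var/lib/config-data"
    vols = sorted(os.listdir(prefix)) if os.path.isdir(prefix) else []
    for vol in vols:
        h = hashlib.sha256()
        for root, dirs, files in os.walk(os.path.join(prefix, vol)):
            dirs.sort()  # deterministic walk order
            for name in sorted(files):
                path = os.path.join(root, name)
                h.update(path.encode())
                with open(path, "rb") as f:
                    h.update(f.read())
        print(vol, h.hexdigest())
]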
Dec 06 09:50:25 compute-1 ceph-mon[79770]: pgmap v267: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 06 09:50:25 compute-1 sudo[131025]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ujilfvjgfnsyctmesqkputgqenxiqzaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014625.0923517-1614-1159814726779/AnsiballZ_podman_container_info.py'
Dec 06 09:50:25 compute-1 sudo[131025]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:50:25 compute-1 python3.9[131027]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec 06 09:50:25 compute-1 sudo[131025]: pam_unix(sudo:session): session closed for user root
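[Note: podman_container_info with name=None reports on every container. The equivalent by hand uses only stock podman verbs; the module's internals may differ.

    # Inspect all containers, as the module does when no name is given.
    import json, subprocess

    ids = subprocess.run(["podman", "ps", "-aq"], capture_output=True,
                         text=True, check=True).stdout.split()
    if ids:
        out = subprocess.run(["podman", "container", "inspect", *ids],
                             capture_output=True, text=True, check=True).stdout
        print([c["Name"] for c in json.loads(out)])
]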
Dec 06 09:50:26 compute-1 ceph-mon[79770]: pgmap v268: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 06 09:50:26 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:50:26 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:50:26 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:26.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:50:26 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:50:26 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:50:26 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:26.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:50:27 compute-1 sudo[131132]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 09:50:27 compute-1 sudo[131132]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:50:27 compute-1 sudo[131132]: pam_unix(sudo:session): session closed for user root
Dec 06 09:50:27 compute-1 sudo[131230]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sngltxvxdgtrajadppwherlsmjrqmmax ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765014626.9479084-1653-173158650708652/AnsiballZ_edpm_container_manage.py'
Dec 06 09:50:27 compute-1 sudo[131230]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:50:27 compute-1 python3[131232]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Dec 06 09:50:27 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:50:28 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:50:28 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:50:28 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:28.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:50:28 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:50:28 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:50:28 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:28.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:50:29 compute-1 ceph-mon[79770]: pgmap v269: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.4 KiB/s rd, 596 B/s wr, 2 op/s
Dec 06 09:50:30 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:30 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 09:50:30 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:30 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 09:50:30 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:50:30 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:50:30 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:30.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:50:30 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:50:30 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:50:30 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:30.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:50:31 compute-1 ceph-mon[79770]: pgmap v270: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 596 B/s wr, 1 op/s
Dec 06 09:50:32 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:50:32 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:50:32 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:32.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:50:32 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:50:32 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:50:32 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:32.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:50:32 compute-1 podman[131246]: 2025-12-06 09:50:32.823378695 +0000 UTC m=+5.080776288 image pull 3a37a52861b2e44ebd2a63ca2589a7c9d8e4119e5feace9d19c6312ed9b8421c quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c
Dec 06 09:50:32 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:50:33 compute-1 podman[131368]: 2025-12-06 09:50:33.011490176 +0000 UTC m=+0.072737454 container create 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec 06 09:50:33 compute-1 podman[131368]: 2025-12-06 09:50:32.974223137 +0000 UTC m=+0.035470495 image pull 3a37a52861b2e44ebd2a63ca2589a7c9d8e4119e5feace9d19c6312ed9b8421c quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c
Dec 06 09:50:33 compute-1 python3[131232]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c
Dec 06 09:50:33 compute-1 sudo[131230]: pam_unix(sudo:session): session closed for user root
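[Note: the PODMAN-CONTAINER-DEBUG line is the useful artifact here: it shows how edpm_container_manage renders the config_data dict into a podman create invocation. For the keys present in this log the mapping looks roughly like this (image digest and volume list abbreviated; the real module handles more keys).

    # Rendering config_data into 'podman create' flags, mirroring the
    # flags visible in the PODMAN-CONTAINER-DEBUG line above.
    config_data = {
        "environment": {"KOLLA_CONFIG_STRATEGY": "COPY_ALWAYS"},
        "healthcheck": {"test": "/openstack/healthcheck"},
        "image": "quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4...",  # abbreviated
        "net": "host",
        "privileged": True,
        "user": "root",
        "volumes": ["/lib/modules:/lib/modules:ro", "/run:/run"],  # abbreviated
    }

    argv = ["podman", "create", "--name", "ovn_controller"]
    for key, value in config_data["environment"].items():
        argv += ["--env", f"{key}={value}"]
    argv += ["--healthcheck-command", config_data["healthcheck"]["test"]]
    argv += ["--network", config_data["net"], f"--privileged={config_data['privileged']}"]
    argv += ["--user", config_data["user"]]
    for volume in config_data["volumes"]:
        argv += ["--volume", volume]
    argv.append(config_data["image"])
    print(" ".join(argv))
]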
Dec 06 09:50:33 compute-1 ceph-mon[79770]: pgmap v271: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 596 B/s wr, 1 op/s
Dec 06 09:50:33 compute-1 sudo[131555]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yepqmyfpjeenhzhwyhpajwcnvsnqsxsu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014633.5165198-1677-273446292787785/AnsiballZ_stat.py'
Dec 06 09:50:33 compute-1 sudo[131555]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:50:34 compute-1 python3.9[131557]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:50:34 compute-1 sudo[131555]: pam_unix(sudo:session): session closed for user root
Dec 06 09:50:34 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:50:34 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:50:34 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:34.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:50:34 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:50:34 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:50:34 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:34.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:50:34 compute-1 sudo[131710]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fpalvqupkjilauaxnpftllbljbnehvht ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014634.5008948-1704-253695840159599/AnsiballZ_file.py'
Dec 06 09:50:34 compute-1 sudo[131710]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:50:35 compute-1 python3.9[131712]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:50:35 compute-1 sudo[131710]: pam_unix(sudo:session): session closed for user root
Dec 06 09:50:35 compute-1 sudo[131786]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jttkbgvmlhmqphrxnzobkpdhrhxbqejf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014634.5008948-1704-253695840159599/AnsiballZ_stat.py'
Dec 06 09:50:35 compute-1 sudo[131786]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:50:35 compute-1 ceph-mon[79770]: pgmap v272: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.7 KiB/s rd, 937 B/s wr, 3 op/s
Dec 06 09:50:35 compute-1 python3.9[131788]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:50:35 compute-1 sudo[131786]: pam_unix(sudo:session): session closed for user root
Dec 06 09:50:36 compute-1 sudo[131937]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mcdvdcovwihdnygxbvppoxpbitunbrtc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014635.641164-1704-110454569519603/AnsiballZ_copy.py'
Dec 06 09:50:36 compute-1 sudo[131937]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:50:36 compute-1 python3.9[131939]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765014635.641164-1704-110454569519603/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:50:36 compute-1 sudo[131937]: pam_unix(sudo:session): session closed for user root
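[Note: the unit file copied to /etc/systemd/system/edpm_ovn_controller.service is not logged, so the shape below is hypothetical, inferred only from the "Starting ovn_controller container..." and edpm-start-podman-container lines later in this log; both the content and the ExecStart helper path are assumptions.

    # Hypothetical minimal edpm_ovn_controller.service; content and the
    # ExecStart path are assumptions, not the deployed unit.
    UNIT = """\
    [Unit]
    Description=ovn_controller container
    After=network-online.target

    [Service]
    Restart=always
    ExecStart=/usr/libexec/edpm-start-podman-container ovn_controller

    [Install]
    WantedBy=multi-user.target
    """
    print(UNIT)
]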
Dec 06 09:50:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:36 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec 06 09:50:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:36 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec 06 09:50:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:36 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec 06 09:50:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:36 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec 06 09:50:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:36 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec 06 09:50:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:36 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec 06 09:50:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:36 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec 06 09:50:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:36 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 06 09:50:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:36 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 06 09:50:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:36 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 06 09:50:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:36 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec 06 09:50:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:36 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 06 09:50:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:36 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec 06 09:50:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:36 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec 06 09:50:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:36 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec 06 09:50:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:36 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec 06 09:50:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:36 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec 06 09:50:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:36 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec 06 09:50:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:36 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec 06 09:50:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:36 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec 06 09:50:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:36 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec 06 09:50:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:36 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec 06 09:50:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:36 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec 06 09:50:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:36 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec 06 09:50:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:36 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec 06 09:50:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:36 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec 06 09:50:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:36 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
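[Note: ganesha reaches NFS SERVER INITIALIZED despite the CRIT lines above: the container has no /run/dbus/system_bus_socket mounted and no usable /etc/krb5.keytab, so the DBus management interface and the kerberized-NFS callback credentials fail while the NFS service itself still starts. A quick preflight for those two optional dependencies, paths copied from the messages above, to be run inside the container:

    # Check the two optional dependencies whose absence produced the
    # DBUS/krb5 CRIT and WARN lines above.
    from pathlib import Path

    for p in ("/run/dbus/system_bus_socket", "/etc/krb5.keytab"):
        print(p, "present" if Path(p).exists() else "missing")
]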
Dec 06 09:50:36 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:50:36 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:50:36 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:36.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:50:36 compute-1 ceph-mon[79770]: pgmap v273: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.7 KiB/s rd, 938 B/s wr, 3 op/s
Dec 06 09:50:36 compute-1 sudo[132026]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bgeztstbsabbomcotfqqqtaruzwzdcfr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014635.641164-1704-110454569519603/AnsiballZ_systemd.py'
Dec 06 09:50:36 compute-1 sudo[132026]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:50:36 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:50:36 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:50:36 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:36.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:50:36 compute-1 python3.9[132028]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 06 09:50:36 compute-1 systemd[1]: Reloading.
Dec 06 09:50:37 compute-1 systemd-rc-local-generator[132055]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:50:37 compute-1 systemd-sysv-generator[132058]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:50:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:37 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8388000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:50:37 compute-1 sudo[132026]: pam_unix(sudo:session): session closed for user root
Dec 06 09:50:37 compute-1 sudo[132139]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ibouyfqhbddmaylayqqoxhfhafhbjfms ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014635.641164-1704-110454569519603/AnsiballZ_systemd.py'
Dec 06 09:50:37 compute-1 sudo[132139]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:50:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:37 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:50:37 compute-1 python3.9[132141]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:50:37 compute-1 systemd[1]: Reloading.
Dec 06 09:50:37 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:50:37 compute-1 systemd-sysv-generator[132174]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:50:37 compute-1 systemd-rc-local-generator[132170]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:50:38 compute-1 systemd[1]: Starting ovn_controller container...
Dec 06 09:50:38 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:38 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:50:38 compute-1 systemd[1]: Started libcrun container.
Dec 06 09:50:38 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7ca9670be7d4b8862f9a7ddfabd1ceaf608d59f574346d1575185ef1bc74ed2b/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Dec 06 09:50:38 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c.
Dec 06 09:50:38 compute-1 podman[132184]: 2025-12-06 09:50:38.381974734 +0000 UTC m=+0.164369423 container init 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 06 09:50:38 compute-1 ovn_controller[132199]: + sudo -E kolla_set_configs
Dec 06 09:50:38 compute-1 podman[132184]: 2025-12-06 09:50:38.412007551 +0000 UTC m=+0.194402250 container start 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Dec 06 09:50:38 compute-1 edpm-start-podman-container[132184]: ovn_controller
Dec 06 09:50:38 compute-1 systemd[1]: Created slice User Slice of UID 0.
Dec 06 09:50:38 compute-1 systemd[1]: Starting User Runtime Directory /run/user/0...
Dec 06 09:50:38 compute-1 systemd[1]: Finished User Runtime Directory /run/user/0.
Dec 06 09:50:38 compute-1 systemd[1]: Starting User Manager for UID 0...
Dec 06 09:50:38 compute-1 systemd[132242]: pam_unix(systemd-user:session): session opened for user root(uid=0) by root(uid=0)
Dec 06 09:50:38 compute-1 edpm-start-podman-container[132182]: Creating additional drop-in dependency for "ovn_controller" (00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c)
Dec 06 09:50:38 compute-1 podman[132207]: 2025-12-06 09:50:38.505867987 +0000 UTC m=+0.079983347 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3)
Dec 06 09:50:38 compute-1 systemd[1]: 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c-7073a8bb21fd3bfa.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:50:38 compute-1 systemd[1]: 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c-7073a8bb21fd3bfa.service: Failed with result 'exit-code'.
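[Note: the failed transient unit here is podman's first health probe firing while ovn_controller is still starting (the health_status=starting line just above records a failing streak of 1); it should clear once the controller is up. Re-running the same probe by hand, with the container ID taken from this log:

    # Manually trigger the same probe the systemd timer runs.
    import subprocess

    cid = "00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c"
    result = subprocess.run(["podman", "healthcheck", "run", cid])
    print("healthcheck exit status:", result.returncode)
]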
Dec 06 09:50:38 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:50:38 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:50:38 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:38.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:50:38 compute-1 systemd[1]: Reloading.
Dec 06 09:50:38 compute-1 systemd[132242]: Queued start job for default target Main User Target.
Dec 06 09:50:38 compute-1 systemd-rc-local-generator[132289]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:50:38 compute-1 systemd-sysv-generator[132294]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:50:38 compute-1 systemd[132242]: Created slice User Application Slice.
Dec 06 09:50:38 compute-1 systemd[132242]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Dec 06 09:50:38 compute-1 systemd[132242]: Started Daily Cleanup of User's Temporary Directories.
Dec 06 09:50:38 compute-1 systemd[132242]: Reached target Paths.
Dec 06 09:50:38 compute-1 systemd[132242]: Reached target Timers.
Dec 06 09:50:38 compute-1 systemd[132242]: Starting D-Bus User Message Bus Socket...
Dec 06 09:50:38 compute-1 systemd[132242]: Starting Create User's Volatile Files and Directories...
Dec 06 09:50:38 compute-1 systemd[132242]: Finished Create User's Volatile Files and Directories.
Dec 06 09:50:38 compute-1 systemd[132242]: Listening on D-Bus User Message Bus Socket.
Dec 06 09:50:38 compute-1 systemd[132242]: Reached target Sockets.
Dec 06 09:50:38 compute-1 systemd[132242]: Reached target Basic System.
Dec 06 09:50:38 compute-1 systemd[132242]: Reached target Main User Target.
Dec 06 09:50:38 compute-1 systemd[132242]: Startup finished in 135ms.
Dec 06 09:50:38 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:50:38 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:50:38 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:38.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:50:38 compute-1 systemd[1]: Started User Manager for UID 0.
Dec 06 09:50:38 compute-1 systemd[1]: Started ovn_controller container.
Dec 06 09:50:38 compute-1 systemd[1]: Started Session c1 of User root.
Dec 06 09:50:38 compute-1 sudo[132139]: pam_unix(sudo:session): session closed for user root
Dec 06 09:50:38 compute-1 ovn_controller[132199]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 06 09:50:38 compute-1 ovn_controller[132199]: INFO:__main__:Validating config file
Dec 06 09:50:38 compute-1 ovn_controller[132199]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 06 09:50:38 compute-1 ovn_controller[132199]: INFO:__main__:Writing out command to execute
Dec 06 09:50:38 compute-1 systemd[1]: session-c1.scope: Deactivated successfully.
Dec 06 09:50:38 compute-1 ovn_controller[132199]: ++ cat /run_command
Dec 06 09:50:38 compute-1 ovn_controller[132199]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Dec 06 09:50:38 compute-1 ovn_controller[132199]: + ARGS=
Dec 06 09:50:38 compute-1 ovn_controller[132199]: + sudo kolla_copy_cacerts
Dec 06 09:50:38 compute-1 systemd[1]: Started Session c2 of User root.
Dec 06 09:50:38 compute-1 ovn_controller[132199]: + [[ ! -n '' ]]
Dec 06 09:50:38 compute-1 systemd[1]: session-c2.scope: Deactivated successfully.
Dec 06 09:50:38 compute-1 ovn_controller[132199]: + . kolla_extend_start
Dec 06 09:50:38 compute-1 ovn_controller[132199]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Dec 06 09:50:38 compute-1 ovn_controller[132199]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Dec 06 09:50:38 compute-1 ovn_controller[132199]: + umask 0022
Dec 06 09:50:38 compute-1 ovn_controller[132199]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
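[Note: the exec line shows the full controller command: the -p/-c/-C triple is the standard OVS SSL option set (private key, certificate, CA certificate) used for the ssl: connection to the southbound DB a few lines below. Sanity-checking that the mounted PKI paths resolve inside the container:

    # Verify the three TLS paths passed to ovn-controller above.
    from pathlib import Path

    tls = {
        "-p (private key)": "/etc/pki/tls/private/ovndb.key",
        "-c (certificate)": "/etc/pki/tls/certs/ovndb.crt",
        "-C (CA cert)":     "/etc/pki/tls/certs/ovndbca.crt",
    }
    for flag, path in tls.items():
        print(flag, path, "ok" if Path(path).exists() else "missing")
]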
Dec 06 09:50:38 compute-1 ovn_controller[132199]: 2025-12-06T09:50:38Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Dec 06 09:50:38 compute-1 ovn_controller[132199]: 2025-12-06T09:50:38Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Dec 06 09:50:38 compute-1 ovn_controller[132199]: 2025-12-06T09:50:38Z|00003|main|INFO|OVN internal version is : [24.03.8-20.33.0-76.8]
Dec 06 09:50:38 compute-1 ovn_controller[132199]: 2025-12-06T09:50:38Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Dec 06 09:50:38 compute-1 ovn_controller[132199]: 2025-12-06T09:50:38Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Dec 06 09:50:38 compute-1 ovn_controller[132199]: 2025-12-06T09:50:38Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Dec 06 09:50:38 compute-1 NetworkManager[48956]: <info>  [1765014638.9687] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/16)
Dec 06 09:50:38 compute-1 NetworkManager[48956]: <info>  [1765014638.9701] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 06 09:50:38 compute-1 NetworkManager[48956]: <info>  [1765014638.9722] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Dec 06 09:50:38 compute-1 NetworkManager[48956]: <info>  [1765014638.9731] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/18)
Dec 06 09:50:38 compute-1 NetworkManager[48956]: <info>  [1765014638.9737] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Dec 06 09:50:38 compute-1 kernel: br-int: entered promiscuous mode
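[Annotation] br-int is the OVN integration bridge that ovn-controller just created through the local OVSDB; NetworkManager merely observes it (managed-type 'external') rather than configuring it. One way to confirm NetworkManager's view of the OVS devices:

    nmcli device status | grep -Ei 'br-int|ovs'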
Dec 06 09:50:38 compute-1 ovn_controller[132199]: 2025-12-06T09:50:38Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Dec 06 09:50:38 compute-1 ovn_controller[132199]: 2025-12-06T09:50:38Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec 06 09:50:38 compute-1 ovn_controller[132199]: 2025-12-06T09:50:38Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec 06 09:50:38 compute-1 ovn_controller[132199]: 2025-12-06T09:50:38Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Dec 06 09:50:38 compute-1 ovn_controller[132199]: 2025-12-06T09:50:38Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Dec 06 09:50:38 compute-1 ovn_controller[132199]: 2025-12-06T09:50:38Z|00012|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Dec 06 09:50:38 compute-1 ovn_controller[132199]: 2025-12-06T09:50:38Z|00013|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Dec 06 09:50:38 compute-1 ovn_controller[132199]: 2025-12-06T09:50:38Z|00014|main|INFO|OVS feature set changed, force recompute.
Dec 06 09:50:38 compute-1 ovn_controller[132199]: 2025-12-06T09:50:38Z|00015|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec 06 09:50:38 compute-1 ovn_controller[132199]: 2025-12-06T09:50:38Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec 06 09:50:38 compute-1 ovn_controller[132199]: 2025-12-06T09:50:38Z|00017|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec 06 09:50:38 compute-1 ovn_controller[132199]: 2025-12-06T09:50:38Z|00018|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Dec 06 09:50:38 compute-1 ovn_controller[132199]: 2025-12-06T09:50:38Z|00019|main|INFO|OVS OpenFlow connection reconnected, force recompute.
Dec 06 09:50:38 compute-1 ovn_controller[132199]: 2025-12-06T09:50:38Z|00020|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec 06 09:50:38 compute-1 ovn_controller[132199]: 2025-12-06T09:50:38Z|00021|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Dec 06 09:50:38 compute-1 ovn_controller[132199]: 2025-12-06T09:50:38Z|00022|main|INFO|OVS feature set changed, force recompute.
Dec 06 09:50:38 compute-1 ovn_controller[132199]: 2025-12-06T09:50:38Z|00023|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Dec 06 09:50:38 compute-1 ovn_controller[132199]: 2025-12-06T09:50:38Z|00024|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Dec 06 09:50:38 compute-1 ovn_controller[132199]: 2025-12-06T09:50:38Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec 06 09:50:38 compute-1 ovn_controller[132199]: 2025-12-06T09:50:38Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec 06 09:50:38 compute-1 ovn_controller[132199]: 2025-12-06T09:50:38Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec 06 09:50:38 compute-1 ovn_controller[132199]: 2025-12-06T09:50:38Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec 06 09:50:38 compute-1 ovn_controller[132199]: 2025-12-06T09:50:38Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec 06 09:50:38 compute-1 ovn_controller[132199]: 2025-12-06T09:50:38Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
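[Annotation] At this point ovn-controller holds its OpenFlow connections to br-int.mgmt (the main ofctrl path plus the pinctrl and statctrl threads) and the SSL session to the southbound DB. A quick liveness check for that state, assuming the ovn-controller control socket is reachable from where the command runs:

    ovn-appctl -t ovn-controller connection-status   # expect: connected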
Dec 06 09:50:38 compute-1 NetworkManager[48956]: <info>  [1765014638.9924] manager: (ovn-1b31b2-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/19)
Dec 06 09:50:38 compute-1 NetworkManager[48956]: <info>  [1765014638.9933] manager: (ovn-127282-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/20)
Dec 06 09:50:38 compute-1 NetworkManager[48956]: <info>  [1765014638.9939] manager: (ovn-d39b5b-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/21)
Dec 06 09:50:39 compute-1 systemd-udevd[132338]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 09:50:39 compute-1 kernel: genev_sys_6081: entered promiscuous mode
Dec 06 09:50:39 compute-1 systemd-udevd[132339]: Network interface NamePolicy= disabled on kernel command line.
Dec 06 09:50:39 compute-1 NetworkManager[48956]: <info>  [1765014639.0154] device (genev_sys_6081): carrier: link connected
Dec 06 09:50:39 compute-1 NetworkManager[48956]: <info>  [1765014639.0162] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/22)
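[Annotation] genev_sys_6081 is the kernel Geneve tunnel device (UDP port 6081) OVN uses for overlay traffic between chassis. To inspect it, and the chassis encapsulation setting (the external_ids key shown is standard OVN chassis configuration, assumed rather than visible in this log):

    ip -d link show genev_sys_6081
    ovs-vsctl get Open_vSwitch . external_ids:ovn-encap-type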
Dec 06 09:50:39 compute-1 ceph-mon[79770]: pgmap v274: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.9 KiB/s rd, 1023 B/s wr, 4 op/s
Dec 06 09:50:39 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 09:50:39 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:39 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:50:39 compute-1 sudo[132468]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxdpadglamtssrqamdrungtauilcqklo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014639.0562449-1788-29235374890136/AnsiballZ_command.py'
Dec 06 09:50:39 compute-1 sudo[132468]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:50:39 compute-1 python3.9[132470]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:50:39 compute-1 ovs-vsctl[132471]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Dec 06 09:50:39 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:39 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c000fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:50:39 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/095039 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec 06 09:50:39 compute-1 sudo[132468]: pam_unix(sudo:session): session closed for user root
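[Annotation] The task above clears other_config:hw-offload on the Open_vSwitch table, i.e. it disables OVS hardware (TC) offload on this chassis. Verifying the result by hand:

    ovs-vsctl get Open_vSwitch . other_config   # hw-offload should no longer appear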
Dec 06 09:50:40 compute-1 sudo[132621]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aucqyfvtptbxcogjfjfgbhmrhosoqeky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014639.8578146-1812-214072331871454/AnsiballZ_command.py'
Dec 06 09:50:40 compute-1 sudo[132621]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:50:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:40 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:50:40 compute-1 python3.9[132623]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g' _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:50:40 compute-1 ovs-vsctl[132626]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Dec 06 09:50:40 compute-1 sudo[132621]: pam_unix(sudo:session): session closed for user root
Dec 06 09:50:40 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:50:40 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:50:40 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:40.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:50:40 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:50:40 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:50:40 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:40.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:50:41 compute-1 ceph-mon[79770]: pgmap v275: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.7 KiB/s rd, 426 B/s wr, 2 op/s
Dec 06 09:50:41 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:41 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83640016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:50:41 compute-1 sudo[132777]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xttocwefdvqocbwiboslyxhlvmyacoit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014640.959108-1854-267813387761751/AnsiballZ_command.py'
Dec 06 09:50:41 compute-1 sudo[132777]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:50:41 compute-1 python3.9[132779]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:50:41 compute-1 ovs-vsctl[132780]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Dec 06 09:50:41 compute-1 sudo[132777]: pam_unix(sudo:session): session closed for user root
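[Annotation] The ovs-vsctl ERR a few lines up is benign: `get` on a missing external_ids key exits non-zero (which the playbook evidently tolerates), while the follow-up `remove` of ovn-cms-options is idempotent whether or not the key exists. A non-fatal form of the query would be:

    # --if-exists makes a missing key print blank instead of erroring
    ovs-vsctl --if-exists get Open_vSwitch . external_ids:ovn-cms-options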
Dec 06 09:50:41 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:41 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c000fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:50:41 compute-1 sshd-session[121197]: Connection closed by 192.168.122.30 port 39444
Dec 06 09:50:41 compute-1 sshd-session[121194]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:50:41 compute-1 systemd[1]: session-49.scope: Deactivated successfully.
Dec 06 09:50:41 compute-1 systemd[1]: session-49.scope: Consumed 59.975s CPU time.
Dec 06 09:50:41 compute-1 systemd-logind[788]: Session 49 logged out. Waiting for processes to exit.
Dec 06 09:50:41 compute-1 systemd-logind[788]: Removed session 49.
Dec 06 09:50:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:42 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83600016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:50:42 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:50:42 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:50:42 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:42.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:50:42 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:50:42 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:50:42 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:42.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:50:42 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:50:43 compute-1 ceph-mon[79770]: pgmap v276: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.7 KiB/s rd, 426 B/s wr, 2 op/s
Dec 06 09:50:43 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:43 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:50:43 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:43 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83640016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:50:44 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:44 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c001f40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:50:44 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:50:44 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:50:44 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:44.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:50:44 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:50:44 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:50:44 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:44.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:50:45 compute-1 ceph-mon[79770]: pgmap v277: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.7 KiB/s rd, 426 B/s wr, 2 op/s
Dec 06 09:50:45 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:45 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360001fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:50:45 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:45 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:50:46 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:46 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83640016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:50:46 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:50:46 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:50:46 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:46.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:50:46 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:50:46 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:50:46 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:46.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:50:47 compute-1 ceph-mon[79770]: pgmap v278: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Dec 06 09:50:47 compute-1 sudo[132808]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 09:50:47 compute-1 sudo[132808]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:50:47 compute-1 sudo[132808]: pam_unix(sudo:session): session closed for user root
Dec 06 09:50:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:47 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c001f40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:50:47 compute-1 sshd-session[132833]: Accepted publickey for zuul from 192.168.122.30 port 59150 ssh2: ECDSA SHA256:r1j7aLsKAM+XxDNbzEU5vWGpGNCOaIBwc7FZdATPttA
Dec 06 09:50:47 compute-1 systemd-logind[788]: New session 51 of user zuul.
Dec 06 09:50:47 compute-1 systemd[1]: Started Session 51 of User zuul.
Dec 06 09:50:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:47 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360001fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:50:47 compute-1 sshd-session[132833]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 06 09:50:47 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:50:48 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:48 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:50:48 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:50:48 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:50:48 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:48.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:50:48 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:50:48 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:50:48 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:48.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:50:49 compute-1 python3.9[132987]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:50:49 compute-1 ceph-mon[79770]: pgmap v279: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 85 B/s wr, 0 op/s
Dec 06 09:50:49 compute-1 systemd[1]: Stopping User Manager for UID 0...
Dec 06 09:50:49 compute-1 systemd[132242]: Activating special unit Exit the Session...
Dec 06 09:50:49 compute-1 systemd[132242]: Stopped target Main User Target.
Dec 06 09:50:49 compute-1 systemd[132242]: Stopped target Basic System.
Dec 06 09:50:49 compute-1 systemd[132242]: Stopped target Paths.
Dec 06 09:50:49 compute-1 systemd[132242]: Stopped target Sockets.
Dec 06 09:50:49 compute-1 systemd[132242]: Stopped target Timers.
Dec 06 09:50:49 compute-1 systemd[132242]: Stopped Daily Cleanup of User's Temporary Directories.
Dec 06 09:50:49 compute-1 systemd[132242]: Closed D-Bus User Message Bus Socket.
Dec 06 09:50:49 compute-1 systemd[132242]: Stopped Create User's Volatile Files and Directories.
Dec 06 09:50:49 compute-1 systemd[132242]: Removed slice User Application Slice.
Dec 06 09:50:49 compute-1 systemd[132242]: Reached target Shutdown.
Dec 06 09:50:49 compute-1 systemd[132242]: Finished Exit the Session.
Dec 06 09:50:49 compute-1 systemd[132242]: Reached target Exit the Session.
Dec 06 09:50:49 compute-1 systemd[1]: user@0.service: Deactivated successfully.
Dec 06 09:50:49 compute-1 systemd[1]: Stopped User Manager for UID 0.
Dec 06 09:50:49 compute-1 systemd[1]: Stopping User Runtime Directory /run/user/0...
Dec 06 09:50:49 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:49 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:50:49 compute-1 systemd[1]: run-user-0.mount: Deactivated successfully.
Dec 06 09:50:49 compute-1 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Dec 06 09:50:49 compute-1 systemd[1]: Stopped User Runtime Directory /run/user/0.
Dec 06 09:50:49 compute-1 systemd[1]: Removed slice User Slice of UID 0.
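[Annotation] The block above is the normal teardown of root's per-user systemd instance (user@0.service) once its last session ended; everything from "Stopping User Manager" through "Removed slice" follows from that single event. As an operational aside (not something this deployment does), lingering would keep the user manager alive across sessions:

    loginctl enable-linger root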
Dec 06 09:50:49 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:49 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c001f40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:50:50 compute-1 sudo[133143]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ekgzhnqzfvaxhueoelcednvhyargkqqb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014649.6265798-63-201306639382287/AnsiballZ_file.py'
Dec 06 09:50:50 compute-1 sudo[133143]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:50:50 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:50 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360001fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:50:50 compute-1 ceph-mon[79770]: pgmap v280: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:50:50 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:50:50 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:50:50 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:50.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:50:50 compute-1 python3.9[133145]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:50:50 compute-1 sudo[133143]: pam_unix(sudo:session): session closed for user root
Dec 06 09:50:50 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:50:50 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:50:50 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:50.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:50:51 compute-1 sudo[133296]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmcwdsqpogzbtioroldwnnoxlaxsoklb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014650.8735294-63-62292961171393/AnsiballZ_file.py'
Dec 06 09:50:51 compute-1 sudo[133296]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:50:51 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:51 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:50:51 compute-1 python3.9[133298]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:50:51 compute-1 sudo[133296]: pam_unix(sudo:session): session closed for user root
Dec 06 09:50:51 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:51 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:50:51 compute-1 sudo[133448]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nppfkogynitfkbgvzkzdxojgszlqtygm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014651.5683308-63-151754526061722/AnsiballZ_file.py'
Dec 06 09:50:51 compute-1 sudo[133448]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:50:52 compute-1 python3.9[133450]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:50:52 compute-1 sudo[133448]: pam_unix(sudo:session): session closed for user root
Dec 06 09:50:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:52 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c003340 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:50:52 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:50:52 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:50:52 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:52.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:50:52 compute-1 sudo[133601]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xkvohgngzxefjsmtsyzvjsnufcbxdyit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014652.3199246-63-124600564257608/AnsiballZ_file.py'
Dec 06 09:50:52 compute-1 sudo[133601]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:50:52 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:50:52 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:50:52 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:52.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:50:52 compute-1 python3.9[133603]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:50:52 compute-1 sudo[133601]: pam_unix(sudo:session): session closed for user root
Dec 06 09:50:52 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:50:53 compute-1 ceph-mon[79770]: pgmap v281: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:50:53 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:53 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83600032f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:50:53 compute-1 sudo[133753]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xkygkvkeywaxxdcviusyfyfcdnemmdbe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014652.9594443-63-258800166017239/AnsiballZ_file.py'
Dec 06 09:50:53 compute-1 sudo[133753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:50:53 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:53 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:50:53 compute-1 python3.9[133755]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:50:53 compute-1 sudo[133753]: pam_unix(sudo:session): session closed for user root
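[Annotation] The file tasks above pre-create the ovn-metadata-agent directory tree (/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent, /var/lib/neutron and its kill_scripts, ovn-metadata-proxy, and external/pids subdirectories), owned by zuul with the container_file_t SELinux type so containers can write there. A rough shell equivalent of one of them:

    mkdir -p /var/lib/neutron/kill_scripts
    chown zuul:zuul /var/lib/neutron/kill_scripts
    chmod 0755 /var/lib/neutron/kill_scripts
    chcon -t container_file_t /var/lib/neutron/kill_scripts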
Dec 06 09:50:54 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 09:50:54 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:54 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:50:54 compute-1 python3.9[133906]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:50:54 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:50:54 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:50:54 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:54.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:50:54 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:50:54 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:50:54 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:54.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:50:55 compute-1 sudo[134061]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xuwuwudajpnsrzjlbmscuiwmibchnfud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014654.7303803-195-254044510549168/AnsiballZ_seboolean.py'
Dec 06 09:50:55 compute-1 sudo[134061]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:50:55 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:55 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c003340 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:50:55 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:55 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83600032f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:50:55 compute-1 python3.9[134063]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Dec 06 09:50:55 compute-1 ceph-mon[79770]: pgmap v282: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:50:56 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:56 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:50:56 compute-1 sudo[134061]: pam_unix(sudo:session): session closed for user root
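[Annotation] ansible.posix.seboolean with persistent=True sets the boolean permanently in SELinux policy; virt_sandbox_use_netlink allows sandboxed container processes to use netlink sockets, which the OVN metadata agent relies on. Hand-run equivalent:

    setsebool -P virt_sandbox_use_netlink on
    getsebool virt_sandbox_use_netlink   # expect: --> on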
Dec 06 09:50:56 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:50:56 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:50:56 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:56.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:50:56 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:50:56 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:50:56 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:56.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:50:56 compute-1 ceph-mon[79770]: pgmap v283: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:50:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:57 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:50:57 compute-1 python3.9[134215]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:50:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:57 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:50:57 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:50:58 compute-1 python3.9[134336]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014656.561391-219-187282155108088/.source follow=False _original_basename=haproxy.j2 checksum=cc5e97ea900947bff0c19d73b88d99840e041f49 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:50:58 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:58 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:50:58 compute-1 ceph-mon[79770]: pgmap v284: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 09:50:58 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:50:58 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:50:58 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:50:58.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:50:58 compute-1 python3.9[134487]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:50:58 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:50:58 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:50:58 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:50:58.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:50:59 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:59 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:50:59 compute-1 python3.9[134608]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014658.2773345-264-252777079797089/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
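[Annotation] The two copies above install the haproxy wrapper (ovn_metadata_haproxy_wrapper) and its kill script, both mode 0755 and container_file_t; the metadata agent invokes these to start and stop its per-network haproxy instances. A quick post-install check:

    ls -lZ /var/lib/neutron/ovn_metadata_haproxy_wrapper \
           /var/lib/neutron/kill_scripts/haproxy-kill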
Dec 06 09:50:59 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:50:59 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:00 compute-1 sudo[134758]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vmvazjaupbvpojnbarjtswalkivlnzfh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014659.771543-315-277341883498911/AnsiballZ_setup.py'
Dec 06 09:51:00 compute-1 sudo[134758]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:51:00 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:00 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:00 compute-1 python3.9[134760]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 06 09:51:00 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:51:00 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:51:00 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:00.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:51:00 compute-1 sudo[134758]: pam_unix(sudo:session): session closed for user root
Dec 06 09:51:00 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:51:00 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:51:00 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:00.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:51:01 compute-1 sudo[134843]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lplzauunymjpncxbcirojnmrwthzkinz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014659.771543-315-277341883498911/AnsiballZ_dnf.py'
Dec 06 09:51:01 compute-1 sudo[134843]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:51:01 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:01 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:01 compute-1 python3.9[134845]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
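[Annotation] The dnf task above, with state=present and no version pin, is a plain idempotent package install. By hand:

    dnf install -y openvswitch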
Dec 06 09:51:01 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:01 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:02 compute-1 ceph-mon[79770]: pgmap v285: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:51:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:02 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:02 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:51:02 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:51:02 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:02.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:51:02 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:51:02 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:51:02 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:02.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:51:02 compute-1 sudo[134843]: pam_unix(sudo:session): session closed for user root
Dec 06 09:51:02 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:51:03 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:03 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:03 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:03 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:03 compute-1 ceph-mon[79770]: pgmap v286: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:51:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:04 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:04 compute-1 sudo[134925]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:51:04 compute-1 sudo[134925]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:51:04 compute-1 sudo[134925]: pam_unix(sudo:session): session closed for user root
Dec 06 09:51:04 compute-1 sudo[134950]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 06 09:51:04 compute-1 sudo[134950]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
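[Annotation] cephadm's gather-facts subcommand collects host inventory (CPU, memory, disks, NICs) for the orchestrator; the mgr drives it over SSH as ceph-admin, hence the surrounding sudo session. Run directly on the host it would be roughly:

    cephadm gather-facts   # prints a JSON fact dump for this host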
Dec 06 09:51:04 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:51:04 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:51:04 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:04.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:51:04 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:51:04 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:51:04 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:04.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:51:04 compute-1 ceph-mon[79770]: pgmap v287: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:51:04 compute-1 sudo[135067]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fjlvunjkigehgsmqlluqllsegheophnd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014664.1108105-351-212823968876022/AnsiballZ_systemd.py'
Dec 06 09:51:04 compute-1 sudo[135067]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:51:05 compute-1 sudo[134950]: pam_unix(sudo:session): session closed for user root
Dec 06 09:51:05 compute-1 python3.9[135069]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
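[Annotation] The systemd task above enables and starts openvswitch.service ahead of the containers that depend on it (the ovn_controller container config later in this log lists it under depends_on). Equivalent commands:

    systemctl enable --now openvswitch.service
    systemctl is-active openvswitch.service   # expect: active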
Dec 06 09:51:05 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:05 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:05 compute-1 sudo[135067]: pam_unix(sudo:session): session closed for user root
Dec 06 09:51:05 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:05 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:05 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 09:51:05 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 06 09:51:05 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:51:05 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:51:05 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 06 09:51:05 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 06 09:51:05 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 09:51:06 compute-1 python3.9[135234]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:51:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:06 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:06 compute-1 python3.9[135356]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014665.531517-375-38292714523132/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:51:06 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:51:06 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:51:06 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:06.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:51:06 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:51:06 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:51:06 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:06.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:51:06 compute-1 ceph-mon[79770]: pgmap v288: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:51:07 compute-1 python3.9[135506]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:51:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:07 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:07 compute-1 sudo[135507]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 09:51:07 compute-1 sudo[135507]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:51:07 compute-1 sudo[135507]: pam_unix(sudo:session): session closed for user root
Dec 06 09:51:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:07 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:07 compute-1 python3.9[135652]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014666.7077394-375-213848021789615/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:51:07 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:51:08 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:08 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:08 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:51:08 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:51:08 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:08.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:51:08 compute-1 ovn_controller[132199]: 2025-12-06T09:51:08Z|00025|memory|INFO|16128 kB peak resident set size after 29.8 seconds
Dec 06 09:51:08 compute-1 ovn_controller[132199]: 2025-12-06T09:51:08Z|00026|memory|INFO|idl-cells-OVN_Southbound:273 idl-cells-Open_vSwitch:642 ofctrl_desired_flow_usage-KB:7 ofctrl_installed_flow_usage-KB:5 ofctrl_sb_flow_ref_usage-KB:3
Dec 06 09:51:08 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:51:08 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:51:08 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:08.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:51:08 compute-1 podman[135679]: 2025-12-06 09:51:08.819315723 +0000 UTC m=+0.113505103 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
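[Annotation] That podman record is a health_status event: the ovn_controller container's configured healthcheck (mount /var/lib/openstack/healthchecks/ovn_controller, test /openstack/healthcheck) ran and reported healthy with a failing streak of 0. The same check can be triggered and read back manually:

    podman healthcheck run ovn_controller && echo healthy
    podman inspect --format '{{.State.Health.Status}}' ovn_controller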
Dec 06 09:51:09 compute-1 ceph-mon[79770]: pgmap v289: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 09:51:09 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 09:51:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:09 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:09 compute-1 python3.9[135830]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:51:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:09 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8358000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:09 compute-1 python3.9[135951]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014668.86935-507-129788869047860/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:51:10 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:10 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:10 compute-1 sudo[136103]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:51:10 compute-1 sudo[136103]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:51:10 compute-1 sudo[136103]: pam_unix(sudo:session): session closed for user root
Dec 06 09:51:10 compute-1 python3.9[136102]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:51:10 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:51:10 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:51:10 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:10.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:51:10 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:51:10 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:51:10 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:10.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:51:11 compute-1 python3.9[136248]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014670.0787635-507-57883399109260/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:51:11 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:11 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:11 compute-1 ceph-mon[79770]: pgmap v290: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:51:11 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:51:11 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:51:11 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:11 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:11 compute-1 python3.9[136398]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:51:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:12 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:12 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:51:12 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:51:12 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:12.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:51:12 compute-1 sudo[136551]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qcafbatweeolvzmbdgipusbugqoforbt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014672.2808325-621-139280987930010/AnsiballZ_file.py'
Dec 06 09:51:12 compute-1 sudo[136551]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:51:12 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:51:12 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:51:12 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:12.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:51:12 compute-1 python3.9[136553]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:51:12 compute-1 sudo[136551]: pam_unix(sudo:session): session closed for user root
Dec 06 09:51:12 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:51:13 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:13 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:13 compute-1 ceph-mon[79770]: pgmap v291: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:51:13 compute-1 sudo[136703]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vjlztdjewhepzylwvldemggucafztbuo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014673.2013712-645-159852202490087/AnsiballZ_stat.py'
Dec 06 09:51:13 compute-1 sudo[136703]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:51:13 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:13 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:13 compute-1 python3.9[136705]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:51:13 compute-1 sudo[136703]: pam_unix(sudo:session): session closed for user root
Dec 06 09:51:13 compute-1 sudo[136781]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fgkdmhrxjioyjcdsvwjwwqtovkiqixks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014673.2013712-645-159852202490087/AnsiballZ_file.py'
Dec 06 09:51:13 compute-1 sudo[136781]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:51:14 compute-1 python3.9[136783]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:51:14 compute-1 sudo[136781]: pam_unix(sudo:session): session closed for user root
Dec 06 09:51:14 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:14 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:14 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:51:14 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:51:14 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:14.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:51:14 compute-1 sudo[136934]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gygxuxojcgwixbdwhrulbcrvnmstyfoa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014674.3441243-645-103005085162881/AnsiballZ_stat.py'
Dec 06 09:51:14 compute-1 sudo[136934]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:51:14 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:51:14 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:51:14 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:14.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:51:14 compute-1 python3.9[136936]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:51:14 compute-1 sudo[136934]: pam_unix(sudo:session): session closed for user root
Dec 06 09:51:15 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:15 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:15 compute-1 sudo[137012]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zwmceagykssushfqpyfwlqcrfiwrsexs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014674.3441243-645-103005085162881/AnsiballZ_file.py'
Dec 06 09:51:15 compute-1 sudo[137012]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:51:15 compute-1 ceph-mon[79770]: pgmap v292: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:51:15 compute-1 python3.9[137014]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:51:15 compute-1 sudo[137012]: pam_unix(sudo:session): session closed for user root
Dec 06 09:51:15 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:15 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:16 compute-1 sudo[137164]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxnyhmzxpejfswfgreedxoscjimgvwtr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014675.722654-714-156637168531532/AnsiballZ_file.py'
Dec 06 09:51:16 compute-1 sudo[137164]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:51:16 compute-1 python3.9[137166]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:51:16 compute-1 sudo[137164]: pam_unix(sudo:session): session closed for user root
Dec 06 09:51:16 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:16 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:16 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:51:16 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:51:16 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:16.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:51:16 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:51:16 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:51:16 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:16.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:51:17 compute-1 sudo[137317]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-whdjhrpvhkqocdmwxdrjzxhnxcsswqne ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014676.497184-738-198809510314256/AnsiballZ_stat.py'
Dec 06 09:51:17 compute-1 sudo[137317]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:51:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:17 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:17 compute-1 python3.9[137319]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:51:17 compute-1 sudo[137317]: pam_unix(sudo:session): session closed for user root
Dec 06 09:51:17 compute-1 ceph-mon[79770]: pgmap v293: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:51:17 compute-1 sudo[137395]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ediiupogoybwadpzlyhpxlzjiggklsts ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014676.497184-738-198809510314256/AnsiballZ_file.py'
Dec 06 09:51:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:17 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:17 compute-1 sudo[137395]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:51:17 compute-1 python3.9[137397]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:51:17 compute-1 sudo[137395]: pam_unix(sudo:session): session closed for user root
Dec 06 09:51:17 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:51:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:18 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:18 compute-1 sudo[137548]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mdcszmtilozrigbekbgcoegcxlzziurh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014678.11927-774-274800745291961/AnsiballZ_stat.py'
Dec 06 09:51:18 compute-1 sudo[137548]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:51:18 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:51:18 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:51:18 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:18.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:51:18 compute-1 python3.9[137550]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:51:18 compute-1 sudo[137548]: pam_unix(sudo:session): session closed for user root
Dec 06 09:51:18 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:51:18 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:51:18 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:18.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:51:18 compute-1 sudo[137626]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jnudnbsqrqcwjotemwfuzyldszhloqat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014678.11927-774-274800745291961/AnsiballZ_file.py'
Dec 06 09:51:18 compute-1 sudo[137626]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:51:19 compute-1 python3.9[137628]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:51:19 compute-1 sudo[137626]: pam_unix(sudo:session): session closed for user root
Dec 06 09:51:19 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:19 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:19 compute-1 ceph-mon[79770]: pgmap v294: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 09:51:19 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:19 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:19 compute-1 sudo[137778]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rvbcxccdubpiwzlfvrzzxfopptsjzlhx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014679.377736-810-43141721959374/AnsiballZ_systemd.py'
Dec 06 09:51:19 compute-1 sudo[137778]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:51:19 compute-1 python3.9[137780]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:51:20 compute-1 systemd[1]: Reloading.
Dec 06 09:51:20 compute-1 systemd-sysv-generator[137808]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:51:20 compute-1 systemd-rc-local-generator[137802]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:51:20 compute-1 sudo[137778]: pam_unix(sudo:session): session closed for user root
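The ansible.builtin.systemd call above (daemon_reload=True, enabled=True, state=started) is what triggered the Reloading and generator messages that follow it. In shell terms it reduces to roughly the steps below; a sketch, not the module's implementation, which also handles idempotence and error reporting:

    import subprocess

    UNIT = 'edpm-container-shutdown.service'
    # Roughly what the module performs for this unit, in order.
    subprocess.run(['systemctl', 'daemon-reload'], check=True)
    subprocess.run(['systemctl', 'enable', UNIT], check=True)
    subprocess.run(['systemctl', 'start', UNIT], check=True)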
Dec 06 09:51:20 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:20 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:20 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:51:20 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:51:20 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:20.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:51:20 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:51:20 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:51:20 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:20.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:51:20 compute-1 sudo[137969]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kvryprhusccnybjngvkofhnyqjjpyqci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014680.6211221-834-162788021710703/AnsiballZ_stat.py'
Dec 06 09:51:20 compute-1 sudo[137969]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:51:21 compute-1 python3.9[137971]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:51:21 compute-1 sudo[137969]: pam_unix(sudo:session): session closed for user root
Dec 06 09:51:21 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:21 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8358002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:21 compute-1 sudo[138047]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ivtcwizdwgianimapxkykunsvatnzumh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014680.6211221-834-162788021710703/AnsiballZ_file.py'
Dec 06 09:51:21 compute-1 sudo[138047]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:51:21 compute-1 ceph-mon[79770]: pgmap v295: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:51:21 compute-1 python3.9[138049]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:51:21 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:21 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:21 compute-1 sudo[138047]: pam_unix(sudo:session): session closed for user root
Dec 06 09:51:22 compute-1 sudo[138199]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zgprpdvgctprkyywymdswamrboaheggn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014681.8285558-870-229807358888054/AnsiballZ_stat.py'
Dec 06 09:51:22 compute-1 sudo[138199]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:51:22 compute-1 python3.9[138201]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:51:22 compute-1 sudo[138199]: pam_unix(sudo:session): session closed for user root
Dec 06 09:51:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:22 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:22 compute-1 ceph-mon[79770]: pgmap v296: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:51:22 compute-1 sudo[138278]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xnqrtitoezrkqcxvgkkaftykxegiqunt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014681.8285558-870-229807358888054/AnsiballZ_file.py'
Dec 06 09:51:22 compute-1 sudo[138278]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:51:22 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:51:22 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:51:22 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:22.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:51:22 compute-1 python3.9[138280]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:51:22 compute-1 sudo[138278]: pam_unix(sudo:session): session closed for user root
Dec 06 09:51:22 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:51:22 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:51:22 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:22.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:51:22 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:51:23 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:23 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:23 compute-1 sudo[138430]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ulikeogybavemointxxpputmcjycjems ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014683.0330977-906-122755496344682/AnsiballZ_systemd.py'
Dec 06 09:51:23 compute-1 sudo[138430]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:51:23 compute-1 python3.9[138432]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:51:23 compute-1 systemd[1]: Reloading.
Dec 06 09:51:23 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:23 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8358002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:23 compute-1 systemd-sysv-generator[138462]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:51:23 compute-1 systemd-rc-local-generator[138458]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:51:23 compute-1 systemd[1]: Starting Create netns directory...
Dec 06 09:51:23 compute-1 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 06 09:51:23 compute-1 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 06 09:51:23 compute-1 systemd[1]: Finished Create netns directory.
Dec 06 09:51:23 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 09:51:24 compute-1 sudo[138430]: pam_unix(sudo:session): session closed for user root
Dec 06 09:51:24 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:24 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:24 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:51:24 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:51:24 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:24.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:51:24 compute-1 sudo[138625]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jrpddgmfvtqjrivhycbwajctkqlwyail ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014684.3979988-936-244535602398142/AnsiballZ_file.py'
Dec 06 09:51:24 compute-1 sudo[138625]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:51:24 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:51:24 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:51:24 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:24.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:51:24 compute-1 python3.9[138627]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:51:24 compute-1 sudo[138625]: pam_unix(sudo:session): session closed for user root
Dec 06 09:51:24 compute-1 ceph-mon[79770]: pgmap v297: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:51:25 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:25 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:25 compute-1 sudo[138777]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hofpussgqhhbtvknynqwkjiwhsyvqqjy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014685.1568-960-109428248298576/AnsiballZ_stat.py'
Dec 06 09:51:25 compute-1 sudo[138777]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:51:25 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:25 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:25 compute-1 python3.9[138779]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:51:25 compute-1 sudo[138777]: pam_unix(sudo:session): session closed for user root
Dec 06 09:51:26 compute-1 sudo[138900]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxcqguhiuzmluxocdagaajjksozwrsws ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014685.1568-960-109428248298576/AnsiballZ_copy.py'
Dec 06 09:51:26 compute-1 sudo[138900]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:51:26 compute-1 python3.9[138902]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765014685.1568-960-109428248298576/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:51:26 compute-1 sudo[138900]: pam_unix(sudo:session): session closed for user root
Dec 06 09:51:26 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:26 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8358002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:26 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:51:26 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:51:26 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:26.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:51:26 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:51:26 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:51:26 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:26.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:51:26 compute-1 sudo[139053]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zodpzftmgjpkmtanwnbkaqiwbnylwdik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014686.7298484-1011-218843277015859/AnsiballZ_file.py'
Dec 06 09:51:26 compute-1 sudo[139053]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:51:27 compute-1 ceph-mon[79770]: pgmap v298: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:51:27 compute-1 python3.9[139055]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:51:27 compute-1 sudo[139053]: pam_unix(sudo:session): session closed for user root
Dec 06 09:51:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:27 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:27 compute-1 sudo[139080]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 09:51:27 compute-1 sudo[139080]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:51:27 compute-1 sudo[139080]: pam_unix(sudo:session): session closed for user root
Dec 06 09:51:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:27 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:27 compute-1 sudo[139231]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wfuvptdxanotczpdjekclexiexwnavsc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014687.465574-1035-47698695027449/AnsiballZ_stat.py'
Dec 06 09:51:27 compute-1 sudo[139231]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:51:27 compute-1 python3.9[139233]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:51:27 compute-1 sudo[139231]: pam_unix(sudo:session): session closed for user root
Dec 06 09:51:27 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:51:28 compute-1 sudo[139355]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ccsfisyhwkcztiuhirzmlrkmjxsmwpjl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014687.465574-1035-47698695027449/AnsiballZ_copy.py'
Dec 06 09:51:28 compute-1 sudo[139355]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:51:28 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:28 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:28 compute-1 python3.9[139357]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1765014687.465574-1035-47698695027449/.source.json _original_basename=._q1aj5vy follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:51:28 compute-1 sudo[139355]: pam_unix(sudo:session): session closed for user root
Dec 06 09:51:28 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:51:28 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:51:28 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:28.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:51:28 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:51:28 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:51:28 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:28.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:51:29 compute-1 sudo[139507]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pufeqqkcqmscggdoybvittrwnlssxyik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014688.679473-1080-116308687292944/AnsiballZ_file.py'
Dec 06 09:51:29 compute-1 sudo[139507]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:51:29 compute-1 ceph-mon[79770]: pgmap v299: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 09:51:29 compute-1 python3.9[139509]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:51:29 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:29 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8388000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:29 compute-1 sudo[139507]: pam_unix(sudo:session): session closed for user root
Dec 06 09:51:29 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:29 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:29 compute-1 sudo[139659]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-auyufrybmqyhpdyhvjmbaqyjtcivtcxd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014689.5188744-1104-15958746678284/AnsiballZ_stat.py'
Dec 06 09:51:29 compute-1 sudo[139659]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:51:30 compute-1 sudo[139659]: pam_unix(sudo:session): session closed for user root
Dec 06 09:51:30 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:30 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:30 compute-1 sudo[139783]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nkvmbcyxnnavkgaahvdcvcydyyccjejk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014689.5188744-1104-15958746678284/AnsiballZ_copy.py'
Dec 06 09:51:30 compute-1 sudo[139783]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:51:30 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:51:30 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:51:30 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:30.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:51:30 compute-1 sudo[139783]: pam_unix(sudo:session): session closed for user root
Dec 06 09:51:30 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:51:30 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:51:30 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:30.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:51:31 compute-1 ceph-mon[79770]: pgmap v300: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:51:31 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:31 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8388000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:31 compute-1 sudo[139935]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzsilhbtntdozqxrajdwouziahmddooi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014691.0033464-1155-255362432675720/AnsiballZ_container_config_data.py'
Dec 06 09:51:31 compute-1 sudo[139935]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:51:31 compute-1 python3.9[139937]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Dec 06 09:51:31 compute-1 sudo[139935]: pam_unix(sudo:session): session closed for user root
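container_config_data above is pointed at a config_path with config_pattern=*.json; conceptually it gathers the per-container startup definitions written earlier (such as ovn_metadata_agent.json). A hedged reconstruction of that gathering step, not the edpm_ansible module source:

    import json
    from pathlib import Path

    def load_configs(config_path, pattern='*.json'):
        # Map file stem -> parsed JSON for every matching startup config.
        return {f.stem: json.loads(f.read_text())
                for f in sorted(Path(config_path).glob(pattern))}

    configs = load_configs(
        '/var/lib/edpm-config/container-startup-config/ovn_metadata_agent')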
Dec 06 09:51:31 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:31 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:32 compute-1 sudo[140088]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zghdxhqvydvgecmubhekhmolggagxkwo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014691.924485-1182-78969625284482/AnsiballZ_container_config_hash.py'
Dec 06 09:51:32 compute-1 sudo[140088]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:51:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:32 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:32 compute-1 python3.9[140090]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 06 09:51:32 compute-1 sudo[140088]: pam_unix(sudo:session): session closed for user root
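container_config_hash then derives a change-detection hash under config_vol_prefix=/var/lib/config-data, presumably so containers are restarted only when generated config actually changes. A hypothetical sketch of such a content hash; the module's exact scheme may differ:

    import hashlib
    from pathlib import Path

    def config_hash(config_vol_prefix='/var/lib/config-data'):
        digest = hashlib.sha256()
        for f in sorted(Path(config_vol_prefix).rglob('*')):
            if f.is_file():
                digest.update(str(f).encode())   # include path to catch renames
                digest.update(f.read_bytes())
        return digest.hexdigest()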
Dec 06 09:51:32 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:51:32 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:51:32 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:32.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:51:32 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:51:32 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:51:32 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:32.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:51:32 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:51:33 compute-1 ceph-mon[79770]: pgmap v301: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:51:33 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:33 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:33 compute-1 sudo[140240]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zvjmnrlyvyvprwsdhufuyxzdocdkldor ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014692.8084867-1209-256545908025877/AnsiballZ_podman_container_info.py'
Dec 06 09:51:33 compute-1 sudo[140240]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:51:33 compute-1 python3.9[140242]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec 06 09:51:33 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:33 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8388002160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:33 compute-1 sudo[140240]: pam_unix(sudo:session): session closed for user root
Dec 06 09:51:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:34 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:34 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:51:34 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:51:34 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:34.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:51:34 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:51:34 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:51:34 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:34.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:51:35 compute-1 ceph-mon[79770]: pgmap v302: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:51:35 compute-1 sudo[140421]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ylattiqpvgauxmikyhjurcytkmoibgxv ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765014694.6796167-1248-213226500672358/AnsiballZ_edpm_container_manage.py'
Dec 06 09:51:35 compute-1 sudo[140421]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:51:35 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:35 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:35 compute-1 python3[140423]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Dec 06 09:51:35 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:35 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:36 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8388002160 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:36 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:51:36 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:51:36 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:36.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:51:36 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:51:36 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:51:36 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:36.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:51:37 compute-1 ceph-mon[79770]: pgmap v303: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:51:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:37 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:37 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:37 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:51:38 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:38 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:38 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:51:38 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:51:38 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:38.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:51:38 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:51:38 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:51:38 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:38.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:51:39 compute-1 ceph-mon[79770]: pgmap v304: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 09:51:39 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 09:51:39 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:39 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8388002180 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:39 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:39 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:40 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:40 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:51:40 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:51:40 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:40.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:51:40 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:51:40 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:51:40 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:40.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:51:41 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:41 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:41 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:41 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c0023c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:41 compute-1 ceph-mon[79770]: pgmap v305: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:51:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:42 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:42 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:51:42 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:51:42 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:42.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:51:42 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:51:42 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:51:42 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:42.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:51:42 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:51:43 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:43 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:43 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:43 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:44 compute-1 podman[140501]: 2025-12-06 09:51:44.083467935 +0000 UTC m=+4.931364352 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 06 09:51:44 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:44 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c002ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:44 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:51:44 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:51:44 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:44.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:51:44 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:51:44 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:51:44 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:44.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:51:45 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:45 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:45 compute-1 ceph-mon[79770]: pgmap v306: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:51:45 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:45 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:46 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:46 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:46 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:51:46 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:51:46 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:46.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:51:46 compute-1 ceph-mon[79770]: pgmap v307: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:51:46 compute-1 podman[140435]: 2025-12-06 09:51:46.86485706 +0000 UTC m=+11.329815854 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3
Dec 06 09:51:46 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:51:46 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:51:46 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:46.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:51:47 compute-1 podman[140598]: 2025-12-06 09:51:47.047940419 +0000 UTC m=+0.068703644 container create 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 06 09:51:47 compute-1 podman[140598]: 2025-12-06 09:51:47.013107445 +0000 UTC m=+0.033870730 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3
Dec 06 09:51:47 compute-1 python3[140423]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3
Dec 06 09:51:47 compute-1 sudo[140421]: pam_unix(sudo:session): session closed for user root
Dec 06 09:51:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:47 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c002ec0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:47 compute-1 sudo[140661]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 09:51:47 compute-1 sudo[140661]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:51:47 compute-1 sudo[140661]: pam_unix(sudo:session): session closed for user root
Dec 06 09:51:47 compute-1 ceph-mon[79770]: pgmap v308: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:51:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:47 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:47 compute-1 sudo[140811]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qlbyzszecgalegusuvavetttwpyxfuid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014707.4313645-1272-168264870139355/AnsiballZ_stat.py'
Dec 06 09:51:47 compute-1 sudo[140811]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:51:47 compute-1 python3.9[140813]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:51:47 compute-1 sudo[140811]: pam_unix(sudo:session): session closed for user root
Dec 06 09:51:47 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:51:48 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:48 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:48 compute-1 sudo[140966]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wfagpokzgnulamdtojunrmxqwbghyyzr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014708.2079897-1299-82123039678411/AnsiballZ_file.py'
Dec 06 09:51:48 compute-1 sudo[140966]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:51:48 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:51:48 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:51:48 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:48.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:51:48 compute-1 python3.9[140968]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:51:48 compute-1 sudo[140966]: pam_unix(sudo:session): session closed for user root
Dec 06 09:51:48 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:51:48 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:51:48 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:48.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:51:48 compute-1 sudo[141042]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cyvogsoslvvocepzlzfjbhrvkorbcqot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014708.2079897-1299-82123039678411/AnsiballZ_stat.py'
Dec 06 09:51:48 compute-1 sudo[141042]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:51:49 compute-1 python3.9[141044]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:51:49 compute-1 sudo[141042]: pam_unix(sudo:session): session closed for user root
Dec 06 09:51:49 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:49 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:49 compute-1 ceph-mon[79770]: pgmap v309: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 09:51:49 compute-1 sudo[141193]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tpsdimfkprrdxfjvztexxhatuwkzjtzs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014709.207118-1299-223677700029897/AnsiballZ_copy.py'
Dec 06 09:51:49 compute-1 sudo[141193]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:51:49 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:49 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c0018c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:49 compute-1 python3.9[141195]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765014709.207118-1299-223677700029897/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:51:49 compute-1 sudo[141193]: pam_unix(sudo:session): session closed for user root
Dec 06 09:51:50 compute-1 sudo[141269]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mbhzqehypewlymmkyfonzqipuhgzdfek ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014709.207118-1299-223677700029897/AnsiballZ_systemd.py'
Dec 06 09:51:50 compute-1 sudo[141269]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:51:50 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:50 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:50 compute-1 python3.9[141271]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 06 09:51:50 compute-1 systemd[1]: Reloading.
Dec 06 09:51:50 compute-1 systemd-sysv-generator[141300]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:51:50 compute-1 systemd-rc-local-generator[141294]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:51:50 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:51:50 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 09:51:50 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:50.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 09:51:50 compute-1 sudo[141269]: pam_unix(sudo:session): session closed for user root
Dec 06 09:51:50 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:51:50 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:51:50 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:50.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:51:51 compute-1 sudo[141381]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-judiazpcszlfsehuqbohlfwkfzeixeda ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014709.207118-1299-223677700029897/AnsiballZ_systemd.py'
Dec 06 09:51:51 compute-1 sudo[141381]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:51:51 compute-1 ceph-mon[79770]: pgmap v310: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:51:51 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:51 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:51 compute-1 python3.9[141383]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:51:51 compute-1 systemd[1]: Reloading.
Dec 06 09:51:51 compute-1 systemd-rc-local-generator[141415]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:51:51 compute-1 systemd-sysv-generator[141419]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:51:51 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:51 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:51 compute-1 systemd[1]: Starting ovn_metadata_agent container...
Dec 06 09:51:51 compute-1 systemd[1]: Started libcrun container.
Dec 06 09:51:51 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bbf687c04257492bf54ae160cdeb8f8c130ac17bc1e26ca1c1d96f233206af59/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Dec 06 09:51:51 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bbf687c04257492bf54ae160cdeb8f8c130ac17bc1e26ca1c1d96f233206af59/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 06 09:51:51 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b.
Dec 06 09:51:51 compute-1 podman[141425]: 2025-12-06 09:51:51.983987137 +0000 UTC m=+0.182734561 container init 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Dec 06 09:51:52 compute-1 ovn_metadata_agent[141441]: + sudo -E kolla_set_configs
Dec 06 09:51:52 compute-1 podman[141425]: 2025-12-06 09:51:52.015956319 +0000 UTC m=+0.214703713 container start 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 06 09:51:52 compute-1 edpm-start-podman-container[141425]: ovn_metadata_agent
Dec 06 09:51:52 compute-1 ovn_metadata_agent[141441]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 06 09:51:52 compute-1 ovn_metadata_agent[141441]: INFO:__main__:Validating config file
Dec 06 09:51:52 compute-1 ovn_metadata_agent[141441]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 06 09:51:52 compute-1 ovn_metadata_agent[141441]: INFO:__main__:Copying service configuration files
Dec 06 09:51:52 compute-1 ovn_metadata_agent[141441]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Dec 06 09:51:52 compute-1 ovn_metadata_agent[141441]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Dec 06 09:51:52 compute-1 ovn_metadata_agent[141441]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Dec 06 09:51:52 compute-1 ovn_metadata_agent[141441]: INFO:__main__:Writing out command to execute
Dec 06 09:51:52 compute-1 ovn_metadata_agent[141441]: INFO:__main__:Setting permission for /var/lib/neutron
Dec 06 09:51:52 compute-1 ovn_metadata_agent[141441]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Dec 06 09:51:52 compute-1 ovn_metadata_agent[141441]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Dec 06 09:51:52 compute-1 ovn_metadata_agent[141441]: INFO:__main__:Setting permission for /var/lib/neutron/external
Dec 06 09:51:52 compute-1 ovn_metadata_agent[141441]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Dec 06 09:51:52 compute-1 ovn_metadata_agent[141441]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Dec 06 09:51:52 compute-1 ovn_metadata_agent[141441]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Dec 06 09:51:52 compute-1 edpm-start-podman-container[141424]: Creating additional drop-in dependency for "ovn_metadata_agent" (4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b)
Dec 06 09:51:52 compute-1 podman[141448]: 2025-12-06 09:51:52.096931807 +0000 UTC m=+0.069438773 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec 06 09:51:52 compute-1 ovn_metadata_agent[141441]: ++ cat /run_command
Dec 06 09:51:52 compute-1 ovn_metadata_agent[141441]: + CMD=neutron-ovn-metadata-agent
Dec 06 09:51:52 compute-1 ovn_metadata_agent[141441]: + ARGS=
Dec 06 09:51:52 compute-1 ovn_metadata_agent[141441]: + sudo kolla_copy_cacerts
Dec 06 09:51:52 compute-1 systemd[1]: Reloading.
Dec 06 09:51:52 compute-1 ovn_metadata_agent[141441]: + [[ ! -n '' ]]
Dec 06 09:51:52 compute-1 ovn_metadata_agent[141441]: + . kolla_extend_start
Dec 06 09:51:52 compute-1 ovn_metadata_agent[141441]: Running command: 'neutron-ovn-metadata-agent'
Dec 06 09:51:52 compute-1 ovn_metadata_agent[141441]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Dec 06 09:51:52 compute-1 ovn_metadata_agent[141441]: + umask 0022
Dec 06 09:51:52 compute-1 ovn_metadata_agent[141441]: + exec neutron-ovn-metadata-agent
Dec 06 09:51:52 compute-1 systemd-sysv-generator[141523]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:51:52 compute-1 systemd-rc-local-generator[141519]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:51:52 compute-1 systemd[1]: Started ovn_metadata_agent container.
Dec 06 09:51:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:52 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c0018c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:52 compute-1 sudo[141381]: pam_unix(sudo:session): session closed for user root
Dec 06 09:51:52 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:51:52 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:51:52 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:52.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:51:52 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:51:52 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:51:52 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:52.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:51:52 compute-1 sshd-session[132836]: Connection closed by 192.168.122.30 port 59150
Dec 06 09:51:52 compute-1 sshd-session[132833]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:51:52 compute-1 systemd[1]: session-51.scope: Deactivated successfully.
Dec 06 09:51:52 compute-1 systemd[1]: session-51.scope: Consumed 1min 2.948s CPU time.
Dec 06 09:51:52 compute-1 systemd-logind[788]: Session 51 logged out. Waiting for processes to exit.
Dec 06 09:51:52 compute-1 systemd-logind[788]: Removed session 51.
Dec 06 09:51:52 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:51:53 compute-1 ceph-mon[79770]: pgmap v311: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:51:53 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:53 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:53 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:53 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.207 141446 INFO neutron.common.config [-] Logging enabled!
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.207 141446 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.208 141446 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.208 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.208 141446 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.208 141446 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.209 141446 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.209 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.209 141446 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.209 141446 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.209 141446 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.209 141446 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.209 141446 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.209 141446 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.209 141446 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.210 141446 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.210 141446 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.210 141446 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.210 141446 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.210 141446 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.210 141446 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.210 141446 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.210 141446 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.211 141446 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.211 141446 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.211 141446 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.211 141446 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.211 141446 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.211 141446 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.211 141446 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.211 141446 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.211 141446 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.211 141446 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.211 141446 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.212 141446 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.212 141446 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.212 141446 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.212 141446 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.212 141446 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.212 141446 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.212 141446 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.212 141446 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.213 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.213 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.213 141446 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.213 141446 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.213 141446 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.213 141446 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.213 141446 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.213 141446 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.213 141446 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.213 141446 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.213 141446 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.214 141446 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.214 141446 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.214 141446 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.214 141446 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.214 141446 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.214 141446 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.214 141446 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.214 141446 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.214 141446 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.214 141446 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.215 141446 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.215 141446 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.215 141446 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.215 141446 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.228 141446 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.229 141446 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.229 141446 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.229 141446 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.229 141446 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.229 141446 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.229 141446 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.230 141446 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.230 141446 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.230 141446 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.230 141446 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.230 141446 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.230 141446 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.230 141446 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.230 141446 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.230 141446 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.231 141446 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.231 141446 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.231 141446 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.231 141446 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.231 141446 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.231 141446 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.231 141446 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.231 141446 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.232 141446 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.232 141446 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.232 141446 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.232 141446 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.232 141446 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.232 141446 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.232 141446 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.232 141446 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.232 141446 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.232 141446 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.232 141446 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.233 141446 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.233 141446 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.233 141446 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.233 141446 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.233 141446 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.233 141446 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.233 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.233 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.234 141446 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.234 141446 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.234 141446 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.234 141446 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.234 141446 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.234 141446 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.234 141446 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.234 141446 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.234 141446 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.235 141446 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.235 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.235 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.235 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.235 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.235 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.235 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.235 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.235 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.236 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.236 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.236 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.236 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.236 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.236 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.236 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.236 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.237 141446 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.237 141446 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.237 141446 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.237 141446 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.237 141446 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.237 141446 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.237 141446 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.237 141446 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.237 141446 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.238 141446 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.238 141446 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.238 141446 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.238 141446 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.238 141446 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.238 141446 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.238 141446 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.238 141446 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.238 141446 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.239 141446 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.239 141446 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.239 141446 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.239 141446 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.239 141446 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.239 141446 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.239 141446 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.239 141446 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.239 141446 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.239 141446 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.240 141446 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.240 141446 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.240 141446 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.240 141446 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.240 141446 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.240 141446 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.240 141446 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.240 141446 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.240 141446 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.241 141446 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.241 141446 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.241 141446 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.241 141446 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.241 141446 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.241 141446 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.241 141446 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.241 141446 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.241 141446 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.241 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.242 141446 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.242 141446 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.242 141446 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.242 141446 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.242 141446 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.242 141446 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.242 141446 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.242 141446 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.242 141446 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.243 141446 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.243 141446 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.243 141446 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.243 141446 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.243 141446 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.243 141446 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.243 141446 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.243 141446 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.243 141446 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.244 141446 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.244 141446 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.244 141446 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.244 141446 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.244 141446 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.244 141446 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.244 141446 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.244 141446 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.244 141446 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.245 141446 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.245 141446 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.245 141446 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.245 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.245 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.245 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.245 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.245 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.245 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.245 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.246 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.246 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.246 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.246 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.246 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.246 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.246 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.246 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.246 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.247 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.247 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.247 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.247 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.247 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.247 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.247 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.247 141446 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.247 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.248 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.248 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.248 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.248 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.248 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.248 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.248 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.248 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.248 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.249 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.249 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.249 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.249 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.249 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.249 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.249 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.249 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.250 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.250 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.250 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.250 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.250 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.250 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.250 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.251 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.251 141446 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.251 141446 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.251 141446 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.251 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.251 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.251 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.251 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.251 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.252 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.252 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.252 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.252 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.252 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.252 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.252 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.252 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.252 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.253 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.253 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.253 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.253 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.253 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.253 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.253 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.253 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.253 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.254 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.254 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.254 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.254 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.254 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.254 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.254 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.254 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.254 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.255 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.255 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.255 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.255 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.255 141446 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.255 141446 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.265 141446 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.266 141446 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.266 141446 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.266 141446 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.266 141446 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.281 141446 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 61eba479-a995-4b31-88b9-8ebfcea9907e (UUID: 61eba479-a995-4b31-88b9-8ebfcea9907e) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.306 141446 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.307 141446 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.307 141446 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.307 141446 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.310 141446 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.317 141446 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.326 141446 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', '61eba479-a995-4b31-88b9-8ebfcea9907e'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7f6d96f5b8b0>], external_ids={}, name=61eba479-a995-4b31-88b9-8ebfcea9907e, nb_cfg_timestamp=1765014646996, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.327 141446 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7f6d96f4cf70>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.328 141446 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.328 141446 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.328 141446 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.329 141446 INFO oslo_service.service [-] Starting 1 workers
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.334 141446 DEBUG oslo_service.service [-] Started child 141558 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.339 141558 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-2001887'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.361 141446 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpgd8t6f9_/privsep.sock']
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.398 141558 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.399 141558 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.399 141558 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.403 141558 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Dec 06 09:51:54 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.411 141558 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Dec 06 09:51:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.421 141558 INFO eventlet.wsgi.server [-] (141558) wsgi starting up on http:/var/lib/neutron/metadata_proxy
Dec 06 09:51:54 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:54 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003c30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:54 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:51:54 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:51:54 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:54.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:51:54 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:51:54 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:51:54 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:54.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:51:54 compute-1 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Dec 06 09:51:55 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:55.119 141446 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Dec 06 09:51:55 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:55.120 141446 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpgd8t6f9_/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Dec 06 09:51:55 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.937 141563 INFO oslo.privsep.daemon [-] privsep daemon starting
Dec 06 09:51:55 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.943 141563 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Dec 06 09:51:55 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.947 141563 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Dec 06 09:51:55 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:54.947 141563 INFO oslo.privsep.daemon [-] privsep daemon running as pid 141563
Dec 06 09:51:55 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:55.123 141563 DEBUG oslo.privsep.daemon [-] privsep: reply[2e4e6ae1-6b2a-4e54-a88e-3e4617c1dd17]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 09:51:55 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:55 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c0018c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:55 compute-1 ceph-mon[79770]: pgmap v312: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:51:55 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:55 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:55 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:55.684 141563 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:51:55 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:55.684 141563 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:51:55 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:55.684 141563 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.295 141563 DEBUG oslo.privsep.daemon [-] privsep: reply[ac2e1070-6678-498d-bf82-5f3b3c28deac]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.298 141446 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=61eba479-a995-4b31-88b9-8ebfcea9907e, column=external_ids, values=({'neutron:ovn-metadata-id': '43f55786-9e75-56e3-ac2c-7faf4144e8c1'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.311 141446 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=61eba479-a995-4b31-88b9-8ebfcea9907e, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.319 141446 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.319 141446 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.320 141446 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.320 141446 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.320 141446 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.320 141446 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.320 141446 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.320 141446 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.320 141446 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.321 141446 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.321 141446 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.321 141446 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.321 141446 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.321 141446 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.321 141446 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.321 141446 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.321 141446 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.322 141446 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.322 141446 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.322 141446 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.322 141446 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.322 141446 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.322 141446 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.322 141446 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.322 141446 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.323 141446 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.323 141446 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.323 141446 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.323 141446 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.323 141446 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.323 141446 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.323 141446 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.324 141446 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.324 141446 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.324 141446 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.324 141446 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.324 141446 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.325 141446 DEBUG oslo_service.service [-] host                           = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.325 141446 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.325 141446 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.325 141446 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.325 141446 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.325 141446 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.326 141446 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.326 141446 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.326 141446 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.326 141446 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.326 141446 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.326 141446 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.326 141446 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.326 141446 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.327 141446 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.327 141446 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.327 141446 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.327 141446 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.327 141446 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.327 141446 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.327 141446 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.327 141446 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.327 141446 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.328 141446 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.328 141446 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.328 141446 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.328 141446 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.328 141446 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.328 141446 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.328 141446 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.328 141446 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.329 141446 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.329 141446 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.329 141446 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.329 141446 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.329 141446 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.329 141446 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.329 141446 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.329 141446 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.329 141446 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.330 141446 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.330 141446 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.330 141446 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.330 141446 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.330 141446 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.330 141446 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.330 141446 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.331 141446 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.331 141446 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.331 141446 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.331 141446 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.331 141446 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.331 141446 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.331 141446 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.331 141446 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.331 141446 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.332 141446 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.332 141446 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.332 141446 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.332 141446 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.332 141446 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.332 141446 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.332 141446 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.332 141446 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.332 141446 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.333 141446 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.333 141446 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.333 141446 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.333 141446 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.333 141446 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.333 141446 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.333 141446 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.333 141446 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.334 141446 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.334 141446 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.334 141446 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.334 141446 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.334 141446 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.334 141446 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.334 141446 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.335 141446 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.335 141446 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.335 141446 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.335 141446 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.335 141446 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.336 141446 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.336 141446 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.336 141446 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.336 141446 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.336 141446 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.336 141446 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.336 141446 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.337 141446 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.337 141446 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.337 141446 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.337 141446 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.337 141446 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.337 141446 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.338 141446 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.338 141446 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.338 141446 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.338 141446 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.338 141446 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.338 141446 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.338 141446 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.339 141446 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.339 141446 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.339 141446 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.339 141446 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.339 141446 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.339 141446 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.339 141446 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.340 141446 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.340 141446 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.340 141446 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.340 141446 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.340 141446 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.340 141446 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.340 141446 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.340 141446 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.340 141446 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.341 141446 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.341 141446 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.341 141446 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.341 141446 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.341 141446 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.341 141446 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.341 141446 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.341 141446 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.341 141446 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.341 141446 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.342 141446 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.342 141446 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.342 141446 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.342 141446 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.342 141446 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.342 141446 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.342 141446 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.343 141446 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.343 141446 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.343 141446 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.343 141446 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.343 141446 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.343 141446 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.343 141446 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.344 141446 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.344 141446 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.344 141446 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.344 141446 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.344 141446 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.344 141446 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.345 141446 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.345 141446 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.345 141446 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.345 141446 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.345 141446 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.345 141446 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.345 141446 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.345 141446 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.346 141446 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.346 141446 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.346 141446 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.346 141446 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.346 141446 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.346 141446 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.346 141446 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.347 141446 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.347 141446 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.347 141446 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.347 141446 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.347 141446 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.347 141446 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.347 141446 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.348 141446 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.348 141446 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.348 141446 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.348 141446 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.348 141446 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.348 141446 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.348 141446 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.348 141446 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.348 141446 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.349 141446 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.349 141446 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.349 141446 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.349 141446 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.349 141446 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.349 141446 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.349 141446 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.349 141446 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.349 141446 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.350 141446 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.350 141446 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.350 141446 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.350 141446 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.350 141446 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.350 141446 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.350 141446 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.350 141446 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.350 141446 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.351 141446 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.351 141446 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.351 141446 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.351 141446 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.351 141446 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.351 141446 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.351 141446 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.352 141446 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.352 141446 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.352 141446 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.352 141446 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.352 141446 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.352 141446 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.352 141446 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.352 141446 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.352 141446 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.353 141446 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.353 141446 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.353 141446 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.353 141446 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.353 141446 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.353 141446 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.353 141446 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.353 141446 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.353 141446 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.354 141446 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.354 141446 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.354 141446 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.354 141446 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.354 141446 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.354 141446 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.354 141446 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.355 141446 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.355 141446 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.355 141446 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.355 141446 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.355 141446 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.355 141446 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.355 141446 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.356 141446 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.356 141446 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.356 141446 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.356 141446 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.356 141446 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.356 141446 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.356 141446 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.357 141446 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.357 141446 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.357 141446 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.357 141446 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.357 141446 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.357 141446 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.357 141446 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.357 141446 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.357 141446 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.358 141446 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.358 141446 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.358 141446 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.358 141446 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.358 141446 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.358 141446 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.358 141446 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.358 141446 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.359 141446 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.359 141446 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.359 141446 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 09:51:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:51:56.359 141446 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
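[editor's note] The block above is oslo.config's option dump: when debug logging is on, the service calls log_opt_values() once at startup, which emits one DEBUG line per registered option and closes with the row of asterisks seen on the last line. A minimal standalone sketch of the same mechanism (the option names and values here are copied from this log for illustration, not canonical defaults):

    import logging
    from oslo_config import cfg

    CONF = cfg.CONF
    # Register a couple of options under the [ovn] group, mirroring the dump above.
    CONF.register_opts([
        cfg.IntOpt('ovsdb_connection_timeout', default=180),
        cfg.StrOpt('ovn_nb_connection', default='tcp:127.0.0.1:6641'),
    ], group='ovn')

    logging.basicConfig(level=logging.DEBUG)
    LOG = logging.getLogger('oslo_service.service')

    CONF([])                                  # parse an empty command line
    CONF.log_opt_values(LOG, logging.DEBUG)   # one DEBUG line per option, then '****...'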
Dec 06 09:51:56 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:56 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:56 compute-1 ceph-mon[79770]: pgmap v313: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:51:56 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:51:56 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:51:56 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:56.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:51:56 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:51:56 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:51:56 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:56.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
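[editor's note] The paired "starting new request"/"req done" entries with an anonymous "HEAD / HTTP/1.0" arriving every ~2 s, alternating between 192.168.122.102 and 192.168.122.100, have the signature of load-balancer health probes against radosgw. A probe like it can be reproduced with the standard library (the endpoint port is an assumption; the log does not record where radosgw listens):

    import http.client

    # Hypothetical radosgw endpoint; adjust host/port for the actual deployment.
    conn = http.client.HTTPConnection('192.168.122.100', 8080, timeout=2)
    conn.request('HEAD', '/')
    resp = conn.getresponse()
    print(resp.status)   # radosgw answers 200 with an empty body, as logged above
    conn.close()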
Dec 06 09:51:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:57 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003c50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:57 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c001a60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:57 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:51:58 compute-1 sshd-session[141569]: Accepted publickey for zuul from 192.168.122.30 port 40528 ssh2: ECDSA SHA256:r1j7aLsKAM+XxDNbzEU5vWGpGNCOaIBwc7FZdATPttA
Dec 06 09:51:58 compute-1 systemd-logind[788]: New session 52 of user zuul.
Dec 06 09:51:58 compute-1 systemd[1]: Started Session 52 of User zuul.
Dec 06 09:51:58 compute-1 sshd-session[141569]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 06 09:51:58 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:58 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:58 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:51:58 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:51:58 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:51:58.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:51:58 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:51:58 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:51:58 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:51:58.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:51:59 compute-1 ceph-mon[79770]: pgmap v314: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 09:51:59 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:59 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:51:59 compute-1 python3.9[141723]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:51:59 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:51:59 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003c70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:00 compute-1 sudo[141879]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-llmkgeksqikovmilirkrtxqrioegcvrj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014719.9741042-63-216714951255184/AnsiballZ_command.py'
Dec 06 09:52:00 compute-1 sudo[141879]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:52:00 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:00 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8358002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:00 compute-1 python3.9[141881]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
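[editor's note] The AnsiballZ command task above shells out to podman with an anchored-regex name filter, so it matches the container name exactly rather than as a substring. Outside Ansible, the same check looks like this (container name taken from the log):

    import subprocess

    # '--filter name=^nova_virtlogd$' anchors the match; '--format {{.Names}}'
    # prints only the matching names, one per line.
    out = subprocess.run(
        ['podman', 'ps', '-a',
         '--filter', 'name=^nova_virtlogd$',
         '--format', '{{.Names}}'],
        capture_output=True, text=True).stdout.strip()
    print(out or 'no container named nova_virtlogd')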
Dec 06 09:52:00 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:52:00 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 09:52:00 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:00.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 09:52:00 compute-1 sudo[141879]: pam_unix(sudo:session): session closed for user root
Dec 06 09:52:00 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:52:00 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:52:00 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:00.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:52:01 compute-1 ceph-mon[79770]: pgmap v315: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:52:01 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:01 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:01 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:01 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:01 compute-1 sudo[142044]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-akmrgciohfuvjxgkqebqckhgknormzgd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014721.1366842-96-214825417481550/AnsiballZ_systemd_service.py'
Dec 06 09:52:01 compute-1 sudo[142044]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:52:02 compute-1 python3.9[142046]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 06 09:52:02 compute-1 systemd[1]: Reloading.
Dec 06 09:52:02 compute-1 systemd-sysv-generator[142072]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:52:02 compute-1 systemd-rc-local-generator[142068]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:52:02 compute-1 sudo[142044]: pam_unix(sudo:session): session closed for user root
Dec 06 09:52:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:02 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003c70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:02 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:52:02 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:52:02 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:02.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:52:02 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:52:02 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:52:02 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:02.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:52:02 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:52:03 compute-1 ceph-mon[79770]: pgmap v316: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:52:03 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:03 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8358002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:03 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:03 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:03 compute-1 python3.9[142232]: ansible-ansible.builtin.service_facts Invoked
Dec 06 09:52:03 compute-1 network[142249]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 06 09:52:03 compute-1 network[142250]: 'network-scripts' will be removed from distribution in near future.
Dec 06 09:52:03 compute-1 network[142251]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 06 09:52:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:04 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:04 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:52:04 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:52:04 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:04.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:52:04 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:52:04 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 09:52:04 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:04.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 09:52:05 compute-1 ceph-mon[79770]: pgmap v317: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 54 KiB/s rd, 0 B/s wr, 89 op/s
Dec 06 09:52:05 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:05 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003c90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:05 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:05 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8358002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:06 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:06 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:52:06 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:52:06 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:06.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:52:06 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:52:06 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:52:06 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:06.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:52:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:07 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003c90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:07 compute-1 sudo[142342]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 09:52:07 compute-1 sudo[142342]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:52:07 compute-1 sudo[142342]: pam_unix(sudo:session): session closed for user root
Dec 06 09:52:07 compute-1 ceph-mon[79770]: pgmap v318: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 54 KiB/s rd, 0 B/s wr, 89 op/s
Dec 06 09:52:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:07 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:07 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:52:08 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:08 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8358002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:08 compute-1 ceph-mon[79770]: pgmap v319: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 96 KiB/s rd, 0 B/s wr, 159 op/s
Dec 06 09:52:08 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:52:08 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:52:08 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:08.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:52:08 compute-1 sudo[142539]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jpxkfojudagnjlnnmdodaepcehqbzirp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014728.487358-153-108038190043449/AnsiballZ_systemd_service.py'
Dec 06 09:52:08 compute-1 sudo[142539]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:52:08 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:52:08 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 09:52:08 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:08.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 09:52:09 compute-1 python3.9[142541]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:52:09 compute-1 sudo[142539]: pam_unix(sudo:session): session closed for user root
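[editor's note] This task, and the virtlogd/virtnodedevd/virtproxyd/virtqemud/virtsecretd/virtstoraged tasks that follow, each map "enabled=False state=stopped" onto a plain disable-and-stop of one unit. A sketch of the equivalent calls for the unit named above:

    import subprocess

    unit = 'tripleo_nova_libvirt.target'       # unit name taken from the log above
    subprocess.run(['systemctl', 'disable', unit], check=False)
    subprocess.run(['systemctl', 'stop', unit], check=False)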
Dec 06 09:52:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:09 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/095209 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
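[editor's note] haproxy marks nfs.cephfs.2 DOWN after a Layer4 check: a bare TCP connect that got "Connection refused". A minimal version of that probe (the backend address and port are assumptions; the log names the backend but not its IP/port):

    import socket

    def l4_check(host: str, port: int, timeout: float = 1.0) -> bool:
        """Return True if a TCP connect succeeds, as haproxy's Layer4 check does."""
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError as exc:                 # ConnectionRefusedError -> Layer4 DOWN
            print(f'{host}:{port} DOWN: {exc}')
            return False

    l4_check('192.168.122.102', 2049)          # 2049: conventional NFS port (assumed)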
Dec 06 09:52:09 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 09:52:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:09 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003cb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:09 compute-1 sudo[142692]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-skjggqbsdcyanjkzqnywryouxqzwsxiw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014729.3512177-153-36462283253954/AnsiballZ_systemd_service.py'
Dec 06 09:52:09 compute-1 sudo[142692]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:52:09 compute-1 python3.9[142694]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:52:10 compute-1 sudo[142692]: pam_unix(sudo:session): session closed for user root
Dec 06 09:52:10 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:10 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:10 compute-1 sudo[142846]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-laovagcwnupursscahzgvwhuomwhtswv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014730.208084-153-113349496502283/AnsiballZ_systemd_service.py'
Dec 06 09:52:10 compute-1 sudo[142846]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:52:10 compute-1 ceph-mon[79770]: pgmap v320: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 96 KiB/s rd, 0 B/s wr, 159 op/s
Dec 06 09:52:10 compute-1 sudo[142849]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:52:10 compute-1 sudo[142849]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:52:10 compute-1 sudo[142849]: pam_unix(sudo:session): session closed for user root
Dec 06 09:52:10 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:52:10 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:52:10 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:10.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:52:10 compute-1 sudo[142874]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 06 09:52:10 compute-1 sudo[142874]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:52:10 compute-1 python3.9[142848]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:52:10 compute-1 sudo[142846]: pam_unix(sudo:session): session closed for user root
Dec 06 09:52:10 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:52:10 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:52:10 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:10.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:52:11 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:11 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8358002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:11 compute-1 sudo[142874]: pam_unix(sudo:session): session closed for user root
Dec 06 09:52:11 compute-1 sudo[143083]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wtqtvofgyhsiakfkeldsucgmnafzvxds ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014731.049998-153-219804636627134/AnsiballZ_systemd_service.py'
Dec 06 09:52:11 compute-1 sudo[143083]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:52:11 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:11 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:11 compute-1 python3.9[143085]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:52:11 compute-1 sudo[143083]: pam_unix(sudo:session): session closed for user root
Dec 06 09:52:12 compute-1 sudo[143237]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eoszzokcvhbbwqixqagpecinercfgczj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014731.8968325-153-158607886403837/AnsiballZ_systemd_service.py'
Dec 06 09:52:12 compute-1 sudo[143237]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:52:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:12 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8388002180 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:12 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:52:12 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:52:12 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:12.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:52:12 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:52:12 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:52:12 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:12.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:52:13 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:52:13 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:13 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003cd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:13 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 09:52:13 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
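[editor's note] The mon "dispatch" entries record commands arriving from the active mgr (mgr.compute-0.qhdjwa). The same two commands can be issued by hand with the ceph CLI, given client.admin access:

    import subprocess

    # CLI forms of the two dispatched commands logged above.
    subprocess.run(['ceph', 'config', 'generate-minimal-conf'], check=False)
    subprocess.run(['ceph', 'auth', 'get', 'client.admin'], check=False)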
Dec 06 09:52:13 compute-1 python3.9[143239]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:52:13 compute-1 sudo[143237]: pam_unix(sudo:session): session closed for user root
Dec 06 09:52:13 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:13 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8358003490 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:14 compute-1 sudo[143392]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vfaxtxeiprdjgllgbywrryoavtujgdqq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014733.7987323-153-268231773626957/AnsiballZ_systemd_service.py'
Dec 06 09:52:14 compute-1 sudo[143392]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:52:14 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:14 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8358003490 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:14 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:52:14 compute-1 ceph-mon[79770]: pgmap v321: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 96 KiB/s rd, 0 B/s wr, 159 op/s
Dec 06 09:52:14 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:52:14 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 06 09:52:14 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 06 09:52:14 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 09:52:14 compute-1 python3.9[143394]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:52:14 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:52:14 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.003000075s ======
Dec 06 09:52:14 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:14.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000075s
Dec 06 09:52:14 compute-1 sudo[143392]: pam_unix(sudo:session): session closed for user root
Dec 06 09:52:14 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:52:14 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:52:14 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:14.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:52:15 compute-1 sudo[143545]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gdqdwqobyswcjtsbcizxdxdtajqzcvtf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014734.8669982-153-249721973230832/AnsiballZ_systemd_service.py'
Dec 06 09:52:15 compute-1 sudo[143545]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:52:15 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:15 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8388002180 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:15 compute-1 python3.9[143547]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
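A rough shell equivalent of the systemd_service invocations above (a sketch only; the actual changes are made by Ansible's systemd_service module with enabled=False, state=stopped):

    # Stop and disable the legacy TripleO libvirt services named in the log
    for svc in tripleo_nova_virtqemud tripleo_nova_virtsecretd tripleo_nova_virtstoraged; do
        systemctl disable --now "${svc}.service"
    done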
Dec 06 09:52:15 compute-1 sudo[143545]: pam_unix(sudo:session): session closed for user root
Dec 06 09:52:15 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:15 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003cf0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:15 compute-1 ceph-mon[79770]: pgmap v322: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 96 KiB/s rd, 0 B/s wr, 159 op/s
Dec 06 09:52:15 compute-1 podman[143573]: 2025-12-06 09:52:15.989982639 +0000 UTC m=+0.285705204 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3)
Dec 06 09:52:16 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:16 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8358003490 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:16 compute-1 sudo[143726]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nosqpbxfdbbnjztqfqyzdqdiguymttnz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014736.0964122-309-40001176415223/AnsiballZ_file.py'
Dec 06 09:52:16 compute-1 sudo[143726]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:52:16 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:52:16 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:52:16 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:16.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:52:16 compute-1 python3.9[143728]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:52:16 compute-1 sudo[143726]: pam_unix(sudo:session): session closed for user root
Dec 06 09:52:16 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:52:16 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:52:16 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:16.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:52:17 compute-1 sudo[143878]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jerqcbjefnqcchjnvmpjxwcysbmlfzcg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014736.9373603-309-75820829497032/AnsiballZ_file.py'
Dec 06 09:52:17 compute-1 sudo[143878]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:52:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:17 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:17 compute-1 python3.9[143880]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:52:17 compute-1 sudo[143878]: pam_unix(sudo:session): session closed for user root
Dec 06 09:52:17 compute-1 ceph-mon[79770]: pgmap v323: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 42 KiB/s rd, 0 B/s wr, 69 op/s
Dec 06 09:52:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:17 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:17 compute-1 sudo[144030]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rkjzofzqpupewjelaehvhdpkhctzfiar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014737.640364-309-1592511799138/AnsiballZ_file.py'
Dec 06 09:52:17 compute-1 sudo[144030]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:52:18 compute-1 python3.9[144032]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:52:18 compute-1 sudo[144030]: pam_unix(sudo:session): session closed for user root
Dec 06 09:52:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:18 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 09:52:18 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:52:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:18 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003d10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:18 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:52:18 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:52:18 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:18.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:52:18 compute-1 sudo[144183]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-myyhcjlfqswiromxmlpyukufzdwpctxz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014738.4409661-309-225631837769941/AnsiballZ_file.py'
Dec 06 09:52:18 compute-1 sudo[144183]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:52:18 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:52:18 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:52:18 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:18.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:52:18 compute-1 python3.9[144185]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:52:18 compute-1 sudo[144183]: pam_unix(sudo:session): session closed for user root
Dec 06 09:52:19 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:19 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8358003490 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:19 compute-1 sudo[144335]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uhjnicnyghmwmirskephtmkwqghrxjxo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014739.1567767-309-188984723088292/AnsiballZ_file.py'
Dec 06 09:52:19 compute-1 sudo[144335]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:52:19 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:19 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8388001380 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:19 compute-1 python3.9[144337]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:52:19 compute-1 sudo[144335]: pam_unix(sudo:session): session closed for user root
Dec 06 09:52:19 compute-1 ceph-mon[79770]: pgmap v324: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 42 KiB/s rd, 85 B/s wr, 70 op/s
Dec 06 09:52:20 compute-1 sudo[144487]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kbytssvhzdtwkfzvkenvrivftxsnztzo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014739.8508844-309-214265534036260/AnsiballZ_file.py'
Dec 06 09:52:20 compute-1 sudo[144487]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:52:20 compute-1 python3.9[144489]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:52:20 compute-1 sudo[144487]: pam_unix(sudo:session): session closed for user root
Dec 06 09:52:20 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:20 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:20 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:52:20 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:52:20 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:20.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:52:20 compute-1 sudo[144640]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-otczpgjvhiyklfnbbqedattqwauoybdf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014740.5837815-309-75507596685385/AnsiballZ_file.py'
Dec 06 09:52:20 compute-1 sudo[144640]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:52:20 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:52:20 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:52:20 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:20.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:52:21 compute-1 python3.9[144642]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
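The ansible.builtin.file tasks with state=absent above, together with the /etc/systemd/system counterparts that follow, delete the retired unit files from both systemd unit directories; roughly (a sketch, unit names taken from the logged invocations):

    # Remove the retired TripleO libvirt unit files from both unit directories
    for unit in tripleo_nova_libvirt.target \
                tripleo_nova_virtlogd_wrapper.service \
                tripleo_nova_virtnodedevd.service \
                tripleo_nova_virtproxyd.service \
                tripleo_nova_virtqemud.service \
                tripleo_nova_virtsecretd.service \
                tripleo_nova_virtstoraged.service; do
        rm -f "/usr/lib/systemd/system/${unit}" "/etc/systemd/system/${unit}"
    done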
Dec 06 09:52:21 compute-1 sudo[144640]: pam_unix(sudo:session): session closed for user root
Dec 06 09:52:21 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:21 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 09:52:21 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:21 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 09:52:21 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:21 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003d30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:21 compute-1 ceph-mon[79770]: pgmap v325: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 85 B/s wr, 0 op/s
Dec 06 09:52:21 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:21 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8388001380 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:22 compute-1 sudo[144801]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lbwxtyvrmwyvtnpuvhewuzswpvkzyrgs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014741.9211736-459-253349327205215/AnsiballZ_file.py'
Dec 06 09:52:22 compute-1 sudo[144801]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:52:22 compute-1 podman[144766]: 2025-12-06 09:52:22.240349389 +0000 UTC m=+0.066446878 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 06 09:52:22 compute-1 python3.9[144809]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:52:22 compute-1 sudo[144801]: pam_unix(sudo:session): session closed for user root
Dec 06 09:52:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:22 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8388001380 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:22 compute-1 ceph-mon[79770]: pgmap v326: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 85 B/s wr, 0 op/s
Dec 06 09:52:22 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:52:22 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:52:22 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:22.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:52:22 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:52:22 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:52:22 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:22.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:52:23 compute-1 sudo[144966]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evacvdrdiwmoqhvpjquyuejrionqaevv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014742.5876179-459-204098568378282/AnsiballZ_file.py'
Dec 06 09:52:23 compute-1 sudo[144966]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:52:23 compute-1 python3.9[144968]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:52:23 compute-1 sudo[144966]: pam_unix(sudo:session): session closed for user root
Dec 06 09:52:23 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:52:23 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:23 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8388001380 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:23 compute-1 sudo[145118]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zdyluwlmorapgbsqptevtsfjyejhghgp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014743.371013-459-187696529053810/AnsiballZ_file.py'
Dec 06 09:52:23 compute-1 sudo[145118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:52:23 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:23 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003d50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:23 compute-1 python3.9[145120]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:52:23 compute-1 sudo[145118]: pam_unix(sudo:session): session closed for user root
Dec 06 09:52:23 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 09:52:24 compute-1 sudo[145271]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-upilrylkvgycalboquwddjhffqooukgz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014743.9931817-459-83878917595213/AnsiballZ_file.py'
Dec 06 09:52:24 compute-1 sudo[145271]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:52:24 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:24 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec 06 09:52:24 compute-1 python3.9[145273]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:52:24 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:24 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:24 compute-1 sudo[145271]: pam_unix(sudo:session): session closed for user root
Dec 06 09:52:24 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:52:24 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:52:24 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:24.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:52:24 compute-1 sudo[145373]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:52:24 compute-1 sudo[145373]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:52:24 compute-1 sudo[145373]: pam_unix(sudo:session): session closed for user root
Dec 06 09:52:24 compute-1 sudo[145448]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eskgyvhknkjmkrizyodoszcmhcbkhhrj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014744.6040754-459-199442588030944/AnsiballZ_file.py'
Dec 06 09:52:24 compute-1 sudo[145448]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:52:24 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:52:24 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:52:24 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:24.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:52:24 compute-1 ceph-mon[79770]: pgmap v327: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 938 B/s wr, 3 op/s
Dec 06 09:52:24 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:52:24 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:52:25 compute-1 python3.9[145450]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:52:25 compute-1 sudo[145448]: pam_unix(sudo:session): session closed for user root
Dec 06 09:52:25 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:25 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:25 compute-1 sudo[145600]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-svmsqxcghtekpyfndfljzlicxeskdbry ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014745.305539-459-166695679829063/AnsiballZ_file.py'
Dec 06 09:52:25 compute-1 sudo[145600]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:52:25 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:25 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8388009990 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:25 compute-1 python3.9[145602]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:52:25 compute-1 sudo[145600]: pam_unix(sudo:session): session closed for user root
Dec 06 09:52:26 compute-1 sudo[145753]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fgiwdckwbbxalrxffkvabmigkxhnmzgz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014745.9228072-459-42913687784162/AnsiballZ_file.py'
Dec 06 09:52:26 compute-1 sudo[145753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:52:26 compute-1 python3.9[145755]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:52:26 compute-1 sudo[145753]: pam_unix(sudo:session): session closed for user root
Dec 06 09:52:26 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:26 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003d70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:26 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:52:26 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:52:26 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:26.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:52:26 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:52:26 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 09:52:26 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:26.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 09:52:27 compute-1 ceph-mon[79770]: pgmap v328: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 938 B/s wr, 3 op/s
Dec 06 09:52:27 compute-1 sudo[145905]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qgszltnhxxlxmpopicavqnpjdwzxaojo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014746.8968813-612-24941227160610/AnsiballZ_command.py'
Dec 06 09:52:27 compute-1 sudo[145905]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:52:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:27 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:27 compute-1 python3.9[145907]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
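The shell passed to ansible.legacy.command above, reformatted for readability (same logic as logged):

    if systemctl is-active certmonger.service; then
        systemctl disable --now certmonger.service
        # As written in the logged task: mask only when no local override unit exists
        test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
    fi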
Dec 06 09:52:27 compute-1 sudo[145905]: pam_unix(sudo:session): session closed for user root
Dec 06 09:52:27 compute-1 sudo[145931]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 09:52:27 compute-1 sudo[145931]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:52:27 compute-1 sudo[145931]: pam_unix(sudo:session): session closed for user root
Dec 06 09:52:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:27 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:28 compute-1 python3.9[146084]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 06 09:52:28 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:52:28 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:28 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8388009990 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:28 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:52:28 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:52:28 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:28.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:52:28 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:52:28 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:52:28 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:28.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:52:28 compute-1 sudo[146235]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zcukwlmrfzjmsqitryfbwsmfhrakwwur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014748.7020543-666-58321647849839/AnsiballZ_systemd_service.py'
Dec 06 09:52:28 compute-1 sudo[146235]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:52:29 compute-1 ceph-mon[79770]: pgmap v329: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.5 KiB/s rd, 1023 B/s wr, 3 op/s
Dec 06 09:52:29 compute-1 python3.9[146237]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 06 09:52:29 compute-1 systemd[1]: Reloading.
Dec 06 09:52:29 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:29 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003d90 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:29 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/095229 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec 06 09:52:29 compute-1 systemd-rc-local-generator[146266]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:52:29 compute-1 systemd-sysv-generator[146269]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:52:29 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:29 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:29 compute-1 sudo[146235]: pam_unix(sudo:session): session closed for user root
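The "Reloading." line and the two generator warnings above are produced by the daemon_reload=True task; its shell equivalent is simply:

    systemctl daemon-reload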
Dec 06 09:52:30 compute-1 sudo[146423]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dflotoqdfrkctmyanfjypqtuhdxppldy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014749.9430268-690-118473226290060/AnsiballZ_command.py'
Dec 06 09:52:30 compute-1 sudo[146423]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:52:30 compute-1 python3.9[146425]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
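This and the following ansible.legacy.command invocations clear any remembered failed state for the units that were just removed; roughly (a sketch, unit names from the logged commands):

    for unit in tripleo_nova_libvirt.target \
                tripleo_nova_virtlogd_wrapper.service \
                tripleo_nova_virtnodedevd.service \
                tripleo_nova_virtproxyd.service \
                tripleo_nova_virtqemud.service \
                tripleo_nova_virtsecretd.service \
                tripleo_nova_virtstoraged.service; do
        /usr/bin/systemctl reset-failed "$unit"
    done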
Dec 06 09:52:30 compute-1 sudo[146423]: pam_unix(sudo:session): session closed for user root
Dec 06 09:52:30 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:30 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004000 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:30 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:52:30 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:52:30 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:30.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:52:30 compute-1 sudo[146577]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eoxpqmtfmnlcymcpgysxqxyiioydncxy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014750.6074252-690-100090846986978/AnsiballZ_command.py'
Dec 06 09:52:30 compute-1 sudo[146577]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:52:30 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:52:30 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:52:30 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:30.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:52:31 compute-1 python3.9[146579]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:52:31 compute-1 sudo[146577]: pam_unix(sudo:session): session closed for user root
Dec 06 09:52:31 compute-1 ceph-mon[79770]: pgmap v330: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.0 KiB/s rd, 938 B/s wr, 3 op/s
Dec 06 09:52:31 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:31 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f838800a2b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:31 compute-1 sudo[146730]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wgbmiyqkzxktdlzekwnefvyffrjukcqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014751.2387288-690-189127420213542/AnsiballZ_command.py'
Dec 06 09:52:31 compute-1 sudo[146730]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:52:31 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:31 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c002ad0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:31 compute-1 python3.9[146732]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:52:31 compute-1 sudo[146730]: pam_unix(sudo:session): session closed for user root
Dec 06 09:52:32 compute-1 sudo[146883]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nuqmtvywaouwkyfdefzxgkxesewtkdqr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014751.8587794-690-270016625501398/AnsiballZ_command.py'
Dec 06 09:52:32 compute-1 sudo[146883]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:52:32 compute-1 python3.9[146885]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:52:32 compute-1 sudo[146883]: pam_unix(sudo:session): session closed for user root
Dec 06 09:52:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:32 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:32 compute-1 sudo[147037]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ltoehlimlqtbxtvadjnylcshafwbpdih ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014752.4590628-690-33669137215831/AnsiballZ_command.py'
Dec 06 09:52:32 compute-1 sudo[147037]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:52:32 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:52:32 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:52:32 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:32.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:52:32 compute-1 python3.9[147039]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:52:32 compute-1 sudo[147037]: pam_unix(sudo:session): session closed for user root
Dec 06 09:52:32 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:52:32 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:52:32 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:32.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:52:33 compute-1 sudo[147190]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uvrnrajiuhkvxdjttykjzbbxjvrlilva ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014753.0507746-690-34908357035939/AnsiballZ_command.py'
Dec 06 09:52:33 compute-1 sudo[147190]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:52:33 compute-1 ceph-mon[79770]: pgmap v331: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.0 KiB/s rd, 938 B/s wr, 3 op/s
Dec 06 09:52:33 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:52:33 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:33 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004020 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:33 compute-1 python3.9[147192]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:52:33 compute-1 sudo[147190]: pam_unix(sudo:session): session closed for user root
Dec 06 09:52:33 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:33 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f838800a2b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:33 compute-1 sudo[147343]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wtzkszopwkdatblbkyvsyvqcqedfzleg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014753.6901789-690-236477697548879/AnsiballZ_command.py'
Dec 06 09:52:33 compute-1 sudo[147343]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:52:34 compute-1 python3.9[147345]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:52:34 compute-1 sudo[147343]: pam_unix(sudo:session): session closed for user root
Dec 06 09:52:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:34 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c002ad0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:34 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:52:34 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:52:34 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:34.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:52:34 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:52:34 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:52:34 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:34.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:52:35 compute-1 sudo[147497]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-reojscdlfcsqioijxmafjternxxmbccs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014754.7770338-852-146975824474942/AnsiballZ_getent.py'
Dec 06 09:52:35 compute-1 sudo[147497]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:52:35 compute-1 ceph-mon[79770]: pgmap v332: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.0 KiB/s rd, 938 B/s wr, 3 op/s
Dec 06 09:52:35 compute-1 python3.9[147499]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Dec 06 09:52:35 compute-1 sudo[147497]: pam_unix(sudo:session): session closed for user root
Dec 06 09:52:35 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:35 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:35 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:35 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004040 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:36 compute-1 sudo[147650]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eicgrumjbmfbvthlvwagswzuwlfbsfrf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014755.5932977-876-112804447789237/AnsiballZ_group.py'
Dec 06 09:52:36 compute-1 sudo[147650]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:52:36 compute-1 python3.9[147652]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 06 09:52:36 compute-1 groupadd[147654]: group added to /etc/group: name=libvirt, GID=42473
Dec 06 09:52:36 compute-1 groupadd[147654]: group added to /etc/gshadow: name=libvirt
Dec 06 09:52:36 compute-1 groupadd[147654]: new group: name=libvirt, GID=42473
Dec 06 09:52:36 compute-1 sudo[147650]: pam_unix(sudo:session): session closed for user root
Dec 06 09:52:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:36 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f838800a2b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:36 compute-1 ceph-mon[79770]: pgmap v333: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Dec 06 09:52:36 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:52:36 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:52:36 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:36.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:52:36 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:52:36 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:52:36 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:36.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:52:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:37 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c002ad0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:37 compute-1 sudo[147809]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nyspeapkxrkvlqvtfaxacelbiwhchctv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014757.023096-900-210865704259412/AnsiballZ_user.py'
Dec 06 09:52:37 compute-1 sudo[147809]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:52:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:37 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:37 compute-1 python3.9[147811]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec 06 09:52:37 compute-1 useradd[147813]: new user: name=libvirt, UID=42473, GID=42473, home=/home/libvirt, shell=/sbin/nologin, from=/dev/pts/0
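A rough shell equivalent of the group and user tasks above (a sketch; the values are taken from the logged groupadd and useradd records):

    groupadd -g 42473 libvirt
    useradd -u 42473 -g libvirt -c "libvirt user" -s /sbin/nologin -m libvirt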
Dec 06 09:52:37 compute-1 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 06 09:52:38 compute-1 sudo[147809]: pam_unix(sudo:session): session closed for user root
Dec 06 09:52:38 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:52:38 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:38 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004060 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:38 compute-1 ceph-mon[79770]: pgmap v334: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 85 B/s wr, 0 op/s
Dec 06 09:52:38 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:52:38 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:52:38 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:38.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:52:38 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:52:38 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:52:38 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:38.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:52:39 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:39 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004060 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:39 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:39 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004060 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:39 compute-1 sudo[147971]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzgirnesrcnvuatenkcpxjmnnkggycik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014759.5684154-933-269435803207623/AnsiballZ_setup.py'
Dec 06 09:52:39 compute-1 sudo[147971]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:52:39 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 09:52:40 compute-1 python3.9[147973]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 06 09:52:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:40 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:40 compute-1 sudo[147971]: pam_unix(sudo:session): session closed for user root
Dec 06 09:52:40 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:52:40 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:52:40 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:40.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:52:40 compute-1 sudo[148056]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-essixnwjpfavidgoirqaavjvhontxyry ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014759.5684154-933-269435803207623/AnsiballZ_dnf.py'
Dec 06 09:52:40 compute-1 sudo[148056]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:52:40 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:52:40 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:52:40 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:40.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:52:41 compute-1 python3.9[148058]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 06 09:52:41 compute-1 ceph-mon[79770]: pgmap v335: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:52:41 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:41 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f838800a2b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:41 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:41 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f838800a2b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:42 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004060 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:42 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:52:42 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:52:42 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:42.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:52:42 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:52:42 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:52:42 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:42.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:52:43 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:52:43 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:43 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:43 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:43 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c001080 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:44 compute-1 ceph-mon[79770]: pgmap v336: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:52:44 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:44 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f838800a2b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:44 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:52:44 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:52:44 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:44.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:52:44 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:52:44 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:52:44 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:44.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:52:45 compute-1 ceph-mon[79770]: pgmap v337: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:52:45 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:45 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004060 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:45 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:45 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:46 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:46 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f838800a2b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:46 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:52:46 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:52:46 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:46.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:52:46 compute-1 podman[148073]: 2025-12-06 09:52:46.866274581 +0000 UTC m=+0.153602119 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible)
Dec 06 09:52:46 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:52:46 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:52:46 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:46.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:52:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:47 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c001080 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:47 compute-1 ceph-mon[79770]: pgmap v338: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:52:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:47 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004060 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:47 compute-1 sudo[148099]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 09:52:47 compute-1 sudo[148099]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:52:47 compute-1 sudo[148099]: pam_unix(sudo:session): session closed for user root
Dec 06 09:52:48 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:52:48 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:48 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:48 compute-1 ceph-mon[79770]: pgmap v339: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 09:52:48 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:52:48 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:52:48 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:48.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:52:48 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:52:48 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:52:48 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:48.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:52:49 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:49 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f838800a2b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:49 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:49 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c002730 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:50 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:50 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004060 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:50 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:52:50 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:52:50 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:50.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:52:50 compute-1 ceph-mon[79770]: pgmap v340: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:52:50 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:52:50 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:52:50 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:50.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:52:51 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:51 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f838800a2b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:51 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:51 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:52 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:52 compute-1 podman[148245]: 2025-12-06 09:52:52.747101651 +0000 UTC m=+0.054547298 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Dec 06 09:52:52 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:52:52 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:52:52 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:52.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:52:52 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:52:52 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:52:52 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:52.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:52:53 compute-1 ceph-mon[79770]: pgmap v341: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:52:53 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:52:53 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:53 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004060 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:53 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:53 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f838800a2b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:52:54.260 141446 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:52:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:52:54.263 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.004s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:52:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:52:54.263 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:52:54 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 09:52:54 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:54 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c003050 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:54 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:52:54 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:52:54 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:54.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:52:54 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:52:54 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:52:54 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:54.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:52:55 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:55 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:55 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:55 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004060 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:55 compute-1 ceph-mon[79770]: pgmap v342: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:52:56 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:56 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f838800a2b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:56 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:52:56 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:52:56 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:56.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:52:56 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:52:56 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:52:56 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:56.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:52:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:57 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c003050 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:57 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:57 compute-1 ceph-mon[79770]: pgmap v343: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:52:58 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:52:58 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:58 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004200 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:58 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:52:58 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:52:58 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:52:58.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:52:58 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:52:58 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:52:58 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:52:58.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:52:59 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:59 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f838800a2b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:52:59 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:52:59 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c003050 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:00 compute-1 ceph-mon[79770]: pgmap v344: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 09:53:00 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:00 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:00 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:53:00 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:53:00 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:00.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:53:00 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:53:00 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:53:00 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:00.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:53:01 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:01 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8360004220 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:01 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:01 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f838800a2b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:01 compute-1 ceph-mon[79770]: pgmap v345: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:53:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:02 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c0043a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:02 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:53:02 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:53:02 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:02.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:53:02 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:53:02 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:53:02 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:02.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:53:03 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:53:03 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:03 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364002690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:03 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:03 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:03 compute-1 ceph-mon[79770]: pgmap v346: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:53:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:04 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f838800a2b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:04 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:53:04 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.002000051s ======
Dec 06 09:53:04 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:04.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000051s
Dec 06 09:53:04 compute-1 ceph-mon[79770]: pgmap v347: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:53:05 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:53:05 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:53:05 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:04.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:53:05 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:05 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c0043a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:05 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:05 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364002690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:06 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:06 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:53:06 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:53:06 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:06.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:53:07 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:53:07 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:53:07 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:07.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:53:07 compute-1 ceph-mon[79770]: pgmap v348: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:53:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:07 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f838800a2b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:07 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c0043a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:07 compute-1 sudo[148332]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 09:53:07 compute-1 sudo[148332]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:53:07 compute-1 sudo[148332]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:08 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:53:08 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:08 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364002690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:08 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:53:08 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:53:08 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:08.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:53:09 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:53:09 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:53:09 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:09.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:53:09 compute-1 ceph-mon[79770]: pgmap v349: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 09:53:09 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 09:53:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:09 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:09 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f838800a2b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:10 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:10 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c0043a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:10 compute-1 ceph-mon[79770]: pgmap v350: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:53:10 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:53:10 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:53:10 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:10.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:53:10 compute-1 kernel: SELinux:  Converting 2773 SID table entries...
Dec 06 09:53:10 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Dec 06 09:53:10 compute-1 kernel: SELinux:  policy capability open_perms=1
Dec 06 09:53:10 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Dec 06 09:53:10 compute-1 kernel: SELinux:  policy capability always_check_network=0
Dec 06 09:53:10 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 06 09:53:10 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 06 09:53:10 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 06 09:53:11 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:53:11 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:53:11 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:11.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:53:11 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:11 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364002690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:11 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:11 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:12 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f838800a2b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:12 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:53:12 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:53:12 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:12.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:53:13 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:53:13 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:53:13 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:13.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:53:13 compute-1 ceph-mon[79770]: pgmap v351: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:53:13 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:53:13 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:13 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c0043a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:13 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:13 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003430 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:14 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:14 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:14 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:53:14 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:53:14 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:14.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:53:15 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:53:15 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:53:15 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:15.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:53:15 compute-1 ceph-mon[79770]: pgmap v352: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:53:15 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:15 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f838800a2b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:15 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:15 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c002ad0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:16 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:16 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003430 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:16 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:53:16 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:53:16 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:16.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:53:17 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:53:17 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:53:17 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:17.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:53:17 compute-1 ceph-mon[79770]: pgmap v353: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:53:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:17 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:17 compute-1 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Dec 06 09:53:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:17 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f838800a2b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:17 compute-1 podman[148371]: 2025-12-06 09:53:17.854505158 +0000 UTC m=+0.139767064 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 06 09:53:18 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:53:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:18 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c002ad0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:18 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:53:18 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:53:18 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:18.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:53:19 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:53:19 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:53:19 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:19.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:53:19 compute-1 ceph-mon[79770]: pgmap v354: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 09:53:19 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:19 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003430 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:19 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:19 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:20 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:20 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f838800a2b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:20 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:53:20 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:53:20 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:20.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:53:21 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:53:21 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:53:21 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:21.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:53:21 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:21 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c002ad0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:21 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:21 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003430 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:22 compute-1 ceph-mon[79770]: pgmap v355: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:53:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:22 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:22 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:53:22 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:53:22 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:22.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:53:23 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:53:23 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:53:23 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:23.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:53:23 compute-1 kernel: SELinux:  Converting 2773 SID table entries...
Dec 06 09:53:23 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Dec 06 09:53:23 compute-1 kernel: SELinux:  policy capability open_perms=1
Dec 06 09:53:23 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Dec 06 09:53:23 compute-1 kernel: SELinux:  policy capability always_check_network=0
Dec 06 09:53:23 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 06 09:53:23 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 06 09:53:23 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 06 09:53:23 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:53:23 compute-1 ceph-mon[79770]: pgmap v356: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:53:23 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:23 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f838800a2b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:23 compute-1 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Dec 06 09:53:23 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:23 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c002ad0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:23 compute-1 podman[148408]: 2025-12-06 09:53:23.775344381 +0000 UTC m=+0.065008354 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 09:53:24 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:24 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003430 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:24 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 09:53:24 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:53:24 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:53:24 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:24.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:53:24 compute-1 sudo[148430]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:53:25 compute-1 sudo[148430]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:53:25 compute-1 sudo[148430]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:25 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:53:25 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:53:25 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:25.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:53:25 compute-1 sudo[148455]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 06 09:53:25 compute-1 sudo[148455]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:53:25 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:25 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:25 compute-1 sudo[148455]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:25 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:25 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f838800a2b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:25 compute-1 ceph-mon[79770]: pgmap v357: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:53:26 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:26 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c002ad0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:26 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 09:53:26 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 06 09:53:26 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:53:26 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:53:26 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 06 09:53:26 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 06 09:53:26 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 09:53:26 compute-1 ceph-mon[79770]: pgmap v358: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:53:26 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:53:26 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:53:26 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:26.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:53:27 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:53:27 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:53:27 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:27.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:53:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:27 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364003430 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/095327 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec 06 09:53:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:27 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:27 compute-1 sudo[148512]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 09:53:27 compute-1 sudo[148512]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:53:27 compute-1 sudo[148512]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:28 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:53:28 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:28 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:28 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:53:28 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:53:28 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:28.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:53:29 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:53:29 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:53:29 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:29.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:53:29 compute-1 ceph-mon[79770]: pgmap v359: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Dec 06 09:53:29 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:29 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:29 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:29 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8354000b60 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:30 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:30 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c004cc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:30 compute-1 sudo[148541]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:53:30 compute-1 sudo[148541]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:53:30 compute-1 sudo[148541]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:30 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:53:30 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:53:30 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:30.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:53:31 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:53:31 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:53:31 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:31.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:53:31 compute-1 ceph-mon[79770]: pgmap v360: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 06 09:53:31 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:53:31 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:53:31 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:31 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c002ad0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:31 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:31 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:32 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83540016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:32 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:53:32 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:53:32 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:32.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:53:33 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:53:33 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:53:33 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:33.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:53:33 compute-1 ceph-mon[79770]: pgmap v361: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 06 09:53:33 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:53:33 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:33 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c004cc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:33 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:33 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c004cc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:34 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c0040b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:34 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:53:34 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:53:34 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:34.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:53:35 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:53:35 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:53:35 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:35.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:53:35 compute-1 ceph-mon[79770]: pgmap v362: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 06 09:53:35 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:35 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:35 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:35 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83540016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:36 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 09:53:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:36 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c004cc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:36 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:53:36 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:53:36 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:36.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:53:37 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:53:37 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:53:37 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:37.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:53:37 compute-1 ceph-mon[79770]: pgmap v363: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 06 09:53:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:37 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c0040b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:37 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:38 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:53:38 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:38 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83540016a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:38 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:53:38 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:53:38 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:38.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:53:39 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:53:39 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:53:39 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:39.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:53:39 compute-1 ceph-mon[79770]: pgmap v364: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Dec 06 09:53:39 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 09:53:39 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:39 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 09:53:39 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:39 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 09:53:39 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:39 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c004cc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:39 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:39 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c0040b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:40 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:40 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:53:40 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:53:40 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:40.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:53:41 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:53:41 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:53:41 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:41.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:53:41 compute-1 ceph-mon[79770]: pgmap v365: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 597 B/s wr, 1 op/s
Dec 06 09:53:41 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:41 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8354002b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:41 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:41 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8354002b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:42 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec 06 09:53:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:42 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c0040b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:42 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:53:42 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:53:42 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:42.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:53:43 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:53:43 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:53:43 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:43.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:53:43 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:53:43 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:43 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:43 compute-1 ceph-mon[79770]: pgmap v366: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 597 B/s wr, 1 op/s
Dec 06 09:53:43 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:43 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c004cc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:44 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:44 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8354002b10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:44 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:53:44 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:53:44 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:44.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:53:45 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:53:45 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:53:45 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:45.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:53:45 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:45 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c0040b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:45 compute-1 ceph-mon[79770]: pgmap v367: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 1023 B/s wr, 3 op/s
Dec 06 09:53:45 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:45 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:46 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:46 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c004cc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:46 compute-1 ceph-mon[79770]: pgmap v368: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 1023 B/s wr, 3 op/s
Dec 06 09:53:46 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:53:46 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:53:46 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:46.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:53:47 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:53:47 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:53:47 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:47.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:53:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/095347 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec 06 09:53:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:47 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8354003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:47 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c0040b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:48 compute-1 sudo[155370]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 09:53:48 compute-1 sudo[155370]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:53:48 compute-1 sudo[155370]: pam_unix(sudo:session): session closed for user root
Dec 06 09:53:48 compute-1 podman[155431]: 2025-12-06 09:53:48.160889725 +0000 UTC m=+0.119172911 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller)
Dec 06 09:53:48 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:53:48 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:48 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:48 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:53:48 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:53:48 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:48.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:53:49 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:53:49 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:53:49 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:49.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:53:49 compute-1 ceph-mon[79770]: pgmap v369: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 1023 B/s wr, 3 op/s
Dec 06 09:53:49 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:49 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c004cc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:49 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:49 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8354003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:50 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:50 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c0040b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:50 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:53:50 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:53:50 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:50.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:53:51 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:53:51 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:53:51 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:51.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:53:51 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:51 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041c0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:51 compute-1 ceph-mon[79770]: pgmap v370: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 426 B/s wr, 1 op/s
Dec 06 09:53:51 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:51 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041c0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:52 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8354003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:52 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:53:52 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:53:52 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:52.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:53:53 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:53:53 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:53:53 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:53.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:53:53 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:53:53 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:53 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8354003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:53 compute-1 ceph-mon[79770]: pgmap v371: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 426 B/s wr, 1 op/s
Dec 06 09:53:53 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:53 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c004cc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:53:54.261 141446 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:53:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:53:54.262 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:53:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:53:54.262 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:53:54 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:54 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:54 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 09:53:54 compute-1 ceph-mon[79770]: pgmap v372: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 426 B/s wr, 1 op/s
Dec 06 09:53:54 compute-1 podman[159219]: 2025-12-06 09:53:54.774899091 +0000 UTC m=+0.067146588 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Dec 06 09:53:54 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:53:54 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:53:54 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:54.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:53:55 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:53:55 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:53:55 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:55.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:53:55 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:55 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:55 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:55 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c0040b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:56 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:56 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c0040b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:56 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:53:56 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:53:56 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:56.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:53:57 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:53:57 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:53:57 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:57.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:53:57 compute-1 ceph-mon[79770]: pgmap v373: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Dec 06 09:53:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:57 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8354003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:57 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:58 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:53:58 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:58 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f837c004cc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:58 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:53:58 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:53:58 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:53:58.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:53:59 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:53:59 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:53:59 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:53:59.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:53:59 compute-1 ceph-mon[79770]: pgmap v374: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 B/s wr, 0 op/s
Dec 06 09:53:59 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:59 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c0040b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:53:59 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:53:59 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8354003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:54:00 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:54:00 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:54:00 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:54:00 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:54:00 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:00.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:54:01 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:54:01 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:54:01 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:01.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:54:01 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:54:01 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364001070 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:54:01 compute-1 ceph-mon[79770]: pgmap v375: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:54:01 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:54:01 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c0040b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:54:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:54:02 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8354003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:54:02 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:54:02 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:54:02 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:02.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:54:03 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:54:03 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:54:03 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:03.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:54:03 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:54:03 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:54:03 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:54:03 compute-1 ceph-mon[79770]: pgmap v376: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:54:03 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:54:03 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364001070 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:54:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:54:04 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c0040b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:54:04 compute-1 ceph-mon[79770]: pgmap v377: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Dec 06 09:54:04 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:54:04 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:54:04 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:04.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:54:05 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:54:05 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:54:05 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:05.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:54:05 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:54:05 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c0040b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:54:05 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:54:05 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:54:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:54:06 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364001fc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:54:06 compute-1 ceph-mon[79770]: pgmap v378: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:54:06 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:54:06 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:54:06 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:06.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:54:07 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:54:07 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:54:07 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:07.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:54:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:54:07 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c0040b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:54:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:54:07 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8354003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:54:08 compute-1 sudo[165464]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 09:54:08 compute-1 sudo[165464]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:54:08 compute-1 sudo[165464]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:08 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:54:08 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:54:08 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:54:08 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:54:08 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:54:08 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:08.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:54:09 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:54:09 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:54:09 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:09.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:54:09 compute-1 ceph-mon[79770]: pgmap v379: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Dec 06 09:54:09 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 09:54:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:54:09 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364001fc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:54:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:54:09 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c0040b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:54:10 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:54:10 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8354003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:54:10 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:54:10 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:54:10 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:10.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:54:11 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:54:11 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:54:11 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:11.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:54:11 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:54:11 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:54:11 compute-1 ceph-mon[79770]: pgmap v380: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:54:11 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:54:11 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364001fc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:54:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:54:12 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c0040b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:54:12 compute-1 ceph-mon[79770]: pgmap v381: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:54:12 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:54:12 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:54:12 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:12.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:54:13 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:54:13 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:54:13 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:13.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:54:13 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:54:13 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:54:13 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8354003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:54:13 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:54:13 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f83580041e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:54:14 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:54:14 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8364001fc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:54:14 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:54:14 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:54:14 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:14.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:54:15 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:54:15 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:54:15 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:15.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:54:15 compute-1 ceph-mon[79770]: pgmap v382: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Dec 06 09:54:15 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:54:15 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f836c0040b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:54:15 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[130681]: 06/12/2025 09:54:15 : epoch 6933fc5f : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8354003c30 fd 48 proxy ignored for local
Dec 06 09:54:15 compute-1 kernel: ganesha.nfsd[162192]: segfault at 50 ip 00007f8434c9b32e sp 00007f83e9ffa210 error 4 in libntirpc.so.5.8[7f8434c80000+2c000] likely on CPU 6 (core 0, socket 6)
Dec 06 09:54:15 compute-1 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec 06 09:54:15 compute-1 systemd[1]: Started Process Core Dump (PID 165495/UID 0).
Dec 06 09:54:16 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:54:16 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:54:16 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:16.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:54:17 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:54:17 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:54:17 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:17.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:54:18 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:54:18 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:54:18 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:54:18 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:18.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:54:19 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:54:19 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:54:19 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:19.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:54:19 compute-1 ceph-mon[79770]: pgmap v383: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:54:19 compute-1 systemd-coredump[165496]: Process 130685 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 67:
                                                    #0  0x00007f8434c9b32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    ELF object binary architecture: AMD x86-64
Dec 06 09:54:19 compute-1 ceph-mon[79770]: pgmap v384: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Dec 06 09:54:19 compute-1 podman[165499]: 2025-12-06 09:54:19.928122694 +0000 UTC m=+1.226087114 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 09:54:19 compute-1 systemd[1]: systemd-coredump@5-165495-0.service: Deactivated successfully.
Dec 06 09:54:19 compute-1 systemd[1]: systemd-coredump@5-165495-0.service: Consumed 1.492s CPU time.
Dec 06 09:54:20 compute-1 podman[165530]: 2025-12-06 09:54:20.035802592 +0000 UTC m=+0.031149655 container died aeb4a191b30e3d0e639fe714012cb8167b13d0245e7a274e7aa6d996a80dbf01 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, CEPH_REF=squid, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2)
Dec 06 09:54:20 compute-1 systemd[1]: var-lib-containers-storage-overlay-40002f69ba8adcfb87c67e5821ee91e412b0c3574a69c61d93dc56a081e3f1b8-merged.mount: Deactivated successfully.
Dec 06 09:54:20 compute-1 podman[165530]: 2025-12-06 09:54:20.177507138 +0000 UTC m=+0.172854171 container remove aeb4a191b30e3d0e639fe714012cb8167b13d0245e7a274e7aa6d996a80dbf01 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_REF=squid, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS)
Dec 06 09:54:20 compute-1 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Main process exited, code=exited, status=139/n/a
Dec 06 09:54:20 compute-1 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Failed with result 'exit-code'.
Dec 06 09:54:20 compute-1 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Consumed 2.123s CPU time.
Dec 06 09:54:20 compute-1 ceph-mon[79770]: pgmap v385: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:54:20 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:54:20 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:54:20 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:20.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:54:21 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:54:21 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:54:21 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:21.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:54:21 compute-1 kernel: SELinux:  Converting 2774 SID table entries...
Dec 06 09:54:21 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Dec 06 09:54:21 compute-1 kernel: SELinux:  policy capability open_perms=1
Dec 06 09:54:21 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Dec 06 09:54:21 compute-1 kernel: SELinux:  policy capability always_check_network=0
Dec 06 09:54:21 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 06 09:54:21 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 06 09:54:21 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 06 09:54:22 compute-1 groupadd[165588]: group added to /etc/group: name=dnsmasq, GID=992
Dec 06 09:54:22 compute-1 groupadd[165588]: group added to /etc/gshadow: name=dnsmasq
Dec 06 09:54:22 compute-1 groupadd[165588]: new group: name=dnsmasq, GID=992
Dec 06 09:54:22 compute-1 useradd[165596]: new user: name=dnsmasq, UID=992, GID=992, home=/var/lib/dnsmasq, shell=/usr/sbin/nologin, from=none
Dec 06 09:54:22 compute-1 dbus-broker-launch[770]: Noticed file-system modification, trigger reload.
Dec 06 09:54:22 compute-1 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=14 res=1
Dec 06 09:54:22 compute-1 dbus-broker-launch[770]: Noticed file-system modification, trigger reload.
Dec 06 09:54:22 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:54:22 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:54:22 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:22.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:54:23 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:54:23 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:54:23 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:23.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:54:23 compute-1 ceph-mon[79770]: pgmap v386: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:54:23 compute-1 groupadd[165609]: group added to /etc/group: name=clevis, GID=991
Dec 06 09:54:23 compute-1 groupadd[165609]: group added to /etc/gshadow: name=clevis
Dec 06 09:54:23 compute-1 groupadd[165609]: new group: name=clevis, GID=991
Dec 06 09:54:23 compute-1 useradd[165616]: new user: name=clevis, UID=991, GID=991, home=/var/cache/clevis, shell=/usr/sbin/nologin, from=none
Dec 06 09:54:23 compute-1 usermod[165626]: add 'clevis' to group 'tss'
Dec 06 09:54:23 compute-1 usermod[165626]: add 'clevis' to shadow group 'tss'
Dec 06 09:54:23 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:54:24 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 09:54:24 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:54:24 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:54:24 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:24.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:54:25 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:54:25 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:54:25 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:25.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:54:25 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/095425 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec 06 09:54:25 compute-1 ceph-mon[79770]: pgmap v387: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Dec 06 09:54:25 compute-1 podman[165648]: 2025-12-06 09:54:25.767345949 +0000 UTC m=+0.067901083 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Dec 06 09:54:25 compute-1 polkitd[43440]: Reloading rules
Dec 06 09:54:25 compute-1 polkitd[43440]: Collecting garbage unconditionally...
Dec 06 09:54:25 compute-1 polkitd[43440]: Loading rules from directory /etc/polkit-1/rules.d
Dec 06 09:54:25 compute-1 polkitd[43440]: Loading rules from directory /usr/share/polkit-1/rules.d
Dec 06 09:54:25 compute-1 polkitd[43440]: Finished loading, compiling and executing 3 rules
Dec 06 09:54:25 compute-1 polkitd[43440]: Reloading rules
Dec 06 09:54:25 compute-1 polkitd[43440]: Collecting garbage unconditionally...
Dec 06 09:54:25 compute-1 polkitd[43440]: Loading rules from directory /etc/polkit-1/rules.d
Dec 06 09:54:25 compute-1 polkitd[43440]: Loading rules from directory /usr/share/polkit-1/rules.d
Dec 06 09:54:25 compute-1 polkitd[43440]: Finished loading, compiling and executing 3 rules
Dec 06 09:54:26 compute-1 ceph-mon[79770]: pgmap v388: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 06 09:54:26 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:54:26 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:54:26 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:26.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:54:27 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:54:27 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:54:27 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:27.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:54:27 compute-1 groupadd[165835]: group added to /etc/group: name=ceph, GID=167
Dec 06 09:54:27 compute-1 groupadd[165835]: group added to /etc/gshadow: name=ceph
Dec 06 09:54:27 compute-1 groupadd[165835]: new group: name=ceph, GID=167
Dec 06 09:54:28 compute-1 useradd[165841]: new user: name=ceph, UID=167, GID=167, home=/var/lib/ceph, shell=/sbin/nologin, from=none
Dec 06 09:54:28 compute-1 sudo[165848]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 09:54:28 compute-1 sudo[165848]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:54:28 compute-1 sudo[165848]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:28 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:54:28 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:54:28 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:54:28 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:28.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:54:29 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:54:29 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:54:29 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:29.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:54:29 compute-1 ceph-mon[79770]: pgmap v389: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:54:29 compute-1 sshd-session[165874]: Received disconnect from 103.14.32.75 port 46632:11:  [preauth]
Dec 06 09:54:29 compute-1 sshd-session[165874]: Disconnected from authenticating user root 103.14.32.75 port 46632 [preauth]
Dec 06 09:54:30 compute-1 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Scheduled restart job, restart counter is at 6.
Dec 06 09:54:30 compute-1 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.djsnbu for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec 06 09:54:30 compute-1 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Consumed 2.123s CPU time.
Dec 06 09:54:30 compute-1 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.djsnbu for 5ecd3f74-dade-5fc4-92ce-8950ae424258...
Dec 06 09:54:30 compute-1 podman[166348]: 2025-12-06 09:54:30.728618311 +0000 UTC m=+0.055959744 container create 59c3a18112ee7376f7e084c537acf33fec4744253b3178b4083465a9740dedf8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 06 09:54:30 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/674595a2f8d871ddef4522155fda703c933fe31e7b86dbc4d96e00021066cf79/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec 06 09:54:30 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/674595a2f8d871ddef4522155fda703c933fe31e7b86dbc4d96e00021066cf79/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec 06 09:54:30 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/674595a2f8d871ddef4522155fda703c933fe31e7b86dbc4d96e00021066cf79/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 06 09:54:30 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/674595a2f8d871ddef4522155fda703c933fe31e7b86dbc4d96e00021066cf79/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.djsnbu-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec 06 09:54:30 compute-1 podman[166348]: 2025-12-06 09:54:30.698824953 +0000 UTC m=+0.026166406 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 06 09:54:30 compute-1 podman[166348]: 2025-12-06 09:54:30.800610969 +0000 UTC m=+0.127952432 container init 59c3a18112ee7376f7e084c537acf33fec4744253b3178b4083465a9740dedf8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, OSD_FLAVOR=default, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 06 09:54:30 compute-1 podman[166348]: 2025-12-06 09:54:30.806526021 +0000 UTC m=+0.133867454 container start 59c3a18112ee7376f7e084c537acf33fec4744253b3178b4083465a9740dedf8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325)
Dec 06 09:54:30 compute-1 bash[166348]: 59c3a18112ee7376f7e084c537acf33fec4744253b3178b4083465a9740dedf8
Dec 06 09:54:30 compute-1 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.djsnbu for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec 06 09:54:30 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:30 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec 06 09:54:30 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:30 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec 06 09:54:30 compute-1 sudo[166449]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:54:30 compute-1 sudo[166449]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:54:30 compute-1 sudo[166449]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:30 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:30 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec 06 09:54:30 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:30 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec 06 09:54:30 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:30 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec 06 09:54:30 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:30 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec 06 09:54:30 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:30 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec 06 09:54:30 compute-1 sudo[166549]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Dec 06 09:54:30 compute-1 sudo[166549]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:54:30 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:30 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 09:54:30 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:54:30 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:54:30 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:30.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:54:31 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:54:31 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:54:31 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:31.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:54:31 compute-1 systemd[1]: Stopping OpenSSH server daemon...
Dec 06 09:54:31 compute-1 sshd[1008]: Received signal 15; terminating.
Dec 06 09:54:31 compute-1 systemd[1]: sshd.service: Deactivated successfully.
Dec 06 09:54:31 compute-1 systemd[1]: Stopped OpenSSH server daemon.
Dec 06 09:54:31 compute-1 systemd[1]: sshd.service: Consumed 3.137s CPU time, read 564.0K from disk, written 0B to disk.
Dec 06 09:54:31 compute-1 systemd[1]: Stopped target sshd-keygen.target.
Dec 06 09:54:31 compute-1 systemd[1]: Stopping sshd-keygen.target...
Dec 06 09:54:31 compute-1 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 06 09:54:31 compute-1 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 06 09:54:31 compute-1 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 06 09:54:31 compute-1 systemd[1]: Reached target sshd-keygen.target.
Dec 06 09:54:31 compute-1 ceph-mon[79770]: pgmap v390: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 06 09:54:31 compute-1 systemd[1]: Starting OpenSSH server daemon...
Dec 06 09:54:31 compute-1 sshd[166671]: Server listening on 0.0.0.0 port 22.
Dec 06 09:54:31 compute-1 sshd[166671]: Server listening on :: port 22.
Dec 06 09:54:31 compute-1 systemd[1]: Started OpenSSH server daemon.
Dec 06 09:54:31 compute-1 podman[166761]: 2025-12-06 09:54:31.551201935 +0000 UTC m=+0.070690725 container exec 500f8c89b5c281d0ace139835b07ea5bc6259ce3e028d8284d57da97424b55e0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-crash-compute-1, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, ceph=True, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 06 09:54:31 compute-1 podman[166761]: 2025-12-06 09:54:31.678793646 +0000 UTC m=+0.198282396 container exec_died 500f8c89b5c281d0ace139835b07ea5bc6259ce3e028d8284d57da97424b55e0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-crash-compute-1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 06 09:54:32 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:54:32 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:54:32 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:32.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:54:33 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:54:33 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:54:33 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:33.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:54:33 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:54:33 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Dec 06 09:54:33 compute-1 podman[166956]: 2025-12-06 09:54:33.490672304 +0000 UTC m=+1.283561218 container exec 6af22af7046e22bedbb2fb280e4d2c530c5b3cac3959f396bf7fe3d14752a7eb (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 06 09:54:33 compute-1 podman[166956]: 2025-12-06 09:54:33.504569303 +0000 UTC m=+1.297458207 container exec_died 6af22af7046e22bedbb2fb280e4d2c530c5b3cac3959f396bf7fe3d14752a7eb (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 06 09:54:33 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 06 09:54:33 compute-1 systemd[1]: Starting man-db-cache-update.service...
Dec 06 09:54:33 compute-1 podman[167138]: 2025-12-06 09:54:33.912668281 +0000 UTC m=+0.086099433 container exec 59c3a18112ee7376f7e084c537acf33fec4744253b3178b4083465a9740dedf8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, OSD_FLAVOR=default)
Dec 06 09:54:33 compute-1 podman[167138]: 2025-12-06 09:54:33.926773305 +0000 UTC m=+0.100204437 container exec_died 59c3a18112ee7376f7e084c537acf33fec4744253b3178b4083465a9740dedf8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_REF=squid, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 06 09:54:34 compute-1 systemd[1]: Reloading.
Dec 06 09:54:34 compute-1 systemd-rc-local-generator[167252]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:54:34 compute-1 systemd-sysv-generator[167256]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:54:34 compute-1 podman[167271]: 2025-12-06 09:54:34.21651274 +0000 UTC m=+0.069505164 container exec 70891cd2190622057f9c45299e27938f7b2105f0244eda3658dedfb18fed50f0 (image=quay.io/ceph/haproxy:2.3, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd)
Dec 06 09:54:34 compute-1 podman[167271]: 2025-12-06 09:54:34.228809137 +0000 UTC m=+0.081801561 container exec_died 70891cd2190622057f9c45299e27938f7b2105f0244eda3658dedfb18fed50f0 (image=quay.io/ceph/haproxy:2.3, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd)
Dec 06 09:54:34 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 06 09:54:34 compute-1 ceph-mon[79770]: pgmap v391: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 06 09:54:34 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:54:34 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:54:34 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Dec 06 09:54:34 compute-1 podman[167584]: 2025-12-06 09:54:34.957284703 +0000 UTC m=+0.517641487 container exec c8ec7212805c01399bc295ce2c5e69b11fbde393e887859b5ab336e81cd6d1f1 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-1-uzbtlt, build-date=2023-02-22T09:23:20, io.openshift.tags=Ceph keepalived, distribution-scope=public, name=keepalived, release=1793, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.buildah.version=1.28.2, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides keepalived on RHEL 9 for Ceph., vcs-type=git, io.openshift.expose-services=, io.k8s.display-name=Keepalived on RHEL 9, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, version=2.2.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=keepalived-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., description=keepalived for Ceph, architecture=x86_64)
Dec 06 09:54:34 compute-1 podman[167584]: 2025-12-06 09:54:34.974785324 +0000 UTC m=+0.535142108 container exec_died c8ec7212805c01399bc295ce2c5e69b11fbde393e887859b5ab336e81cd6d1f1 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-1-uzbtlt, description=keepalived for Ceph, name=keepalived, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Keepalived on RHEL 9, release=1793, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, architecture=x86_64, io.buildah.version=1.28.2, io.openshift.tags=Ceph keepalived, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides keepalived on RHEL 9 for Ceph., version=2.2.4, io.openshift.expose-services=, com.redhat.component=keepalived-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, build-date=2023-02-22T09:23:20)
Dec 06 09:54:34 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:54:34 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:54:34 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:34.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:54:35 compute-1 sudo[166549]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:35 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:54:35 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:54:35 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:35.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:54:35 compute-1 sudo[168164]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:54:35 compute-1 sudo[168164]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:54:35 compute-1 sudo[168164]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:35 compute-1 sudo[168269]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 06 09:54:35 compute-1 sudo[168269]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:54:35 compute-1 sudo[168269]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:35 compute-1 sudo[148056]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:35 compute-1 ceph-mon[79770]: pgmap v392: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Dec 06 09:54:35 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:54:35 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:54:35 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Dec 06 09:54:35 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 09:54:35 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 06 09:54:35 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:54:35 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:54:35 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 06 09:54:35 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 06 09:54:35 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 09:54:36 compute-1 ceph-mon[79770]: pgmap v393: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 85 B/s wr, 0 op/s
Dec 06 09:54:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:36 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 09:54:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:36 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 09:54:36 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:54:36 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:54:36 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:36.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:54:37 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:54:37 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:54:37 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:37.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:54:38 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:54:38 compute-1 ceph-mon[79770]: pgmap v394: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.4 KiB/s rd, 938 B/s wr, 3 op/s
Dec 06 09:54:38 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:54:38 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:54:38 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:38.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:54:39 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:54:39 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:54:39 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:39.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:54:39 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 09:54:40 compute-1 ceph-mon[79770]: pgmap v395: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.3 KiB/s rd, 938 B/s wr, 3 op/s
Dec 06 09:54:40 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:54:40 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:54:40 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:40.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:54:41 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:54:41 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:54:41 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:41.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:54:42 compute-1 sudo[174491]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:54:42 compute-1 sudo[174491]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:54:42 compute-1 sudo[174491]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:42 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:54:42 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:54:42 compute-1 ceph-mon[79770]: pgmap v396: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.3 KiB/s rd, 938 B/s wr, 3 op/s
Dec 06 09:54:43 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:54:43 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:54:43 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:42.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:54:43 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:43 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec 06 09:54:43 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:43 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec 06 09:54:43 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:43 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec 06 09:54:43 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:43 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec 06 09:54:43 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:43 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec 06 09:54:43 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:43 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec 06 09:54:43 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:43 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec 06 09:54:43 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:43 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 06 09:54:43 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:43 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 06 09:54:43 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:43 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 06 09:54:43 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:43 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec 06 09:54:43 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:43 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 06 09:54:43 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:43 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec 06 09:54:43 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:43 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec 06 09:54:43 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:43 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec 06 09:54:43 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:43 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec 06 09:54:43 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:43 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec 06 09:54:43 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:43 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec 06 09:54:43 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:43 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec 06 09:54:43 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:43 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec 06 09:54:43 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:43 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec 06 09:54:43 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:43 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec 06 09:54:43 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:43 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec 06 09:54:43 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:43 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec 06 09:54:43 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:43 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec 06 09:54:43 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:43 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec 06 09:54:43 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:43 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec 06 09:54:43 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:54:43 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:54:43 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:43.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:54:43 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:54:43 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:43 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb218000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:54:43 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:43 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140013b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:54:43 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 06 09:54:43 compute-1 systemd[1]: Finished man-db-cache-update.service.
Dec 06 09:54:43 compute-1 systemd[1]: man-db-cache-update.service: Consumed 12.685s CPU time.
Dec 06 09:54:43 compute-1 systemd[1]: run-r155ba3ec05ea4cb393ee9881f7853740.service: Deactivated successfully.
Dec 06 09:54:44 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:44 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f4000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:54:45 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:54:45 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:54:45 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:45.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:54:45 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:54:45 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:54:45 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:45.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:54:45 compute-1 ceph-mon[79770]: pgmap v397: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.5 KiB/s rd, 1023 B/s wr, 3 op/s
Dec 06 09:54:45 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:45 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:54:45 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/095445 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec 06 09:54:45 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:45 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c002250 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:54:46 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:46 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb214002090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:54:47 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:54:47 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:54:47 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:47.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:54:47 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:54:47 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:54:47 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:47.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:54:47 compute-1 ceph-mon[79770]: pgmap v398: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.1 KiB/s rd, 938 B/s wr, 3 op/s
Dec 06 09:54:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:47 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f40016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:54:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:47 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:54:48 compute-1 sudo[175890]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 09:54:48 compute-1 sudo[175890]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:54:48 compute-1 sudo[175890]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:48 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:54:48 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:48 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c002d50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:54:49 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:54:49 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:54:49 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:49.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:54:49 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:54:49 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:54:49 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:49.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:54:49 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:49 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb214002090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:54:49 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:49 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f40016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:54:49 compute-1 ceph-mon[79770]: pgmap v399: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 938 B/s wr, 3 op/s
Dec 06 09:54:50 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:50 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:54:50 compute-1 podman[175916]: 2025-12-06 09:54:50.834439672 +0000 UTC m=+0.134962704 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 09:54:50 compute-1 ceph-mon[79770]: pgmap v400: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Dec 06 09:54:51 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:54:51 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:54:51 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:51.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:54:51 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:54:51 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:54:51 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:51.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:54:51 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:51 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:54:51 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:51 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb214002090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:54:52 compute-1 sudo[176070]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iworvnekwxtlrtwtobrktuhdjeifzuhe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014891.2080717-970-188676279374623/AnsiballZ_systemd.py'
Dec 06 09:54:52 compute-1 sudo[176070]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:52 compute-1 python3.9[176073]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 06 09:54:52 compute-1 systemd[1]: Reloading.
Dec 06 09:54:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:52 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f40016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:54:52 compute-1 systemd-rc-local-generator[176101]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:54:52 compute-1 systemd-sysv-generator[176104]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:54:52 compute-1 sudo[176070]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:53 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:54:53 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:54:53 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:53.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:54:53 compute-1 ceph-mon[79770]: pgmap v401: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Dec 06 09:54:53 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:54:53 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:54:53 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:53.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:54:53 compute-1 sudo[176261]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efzgltrhwnoqvcrdjezlhtdpkjvvyxyy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014893.0588782-970-201014305067693/AnsiballZ_systemd.py'
Dec 06 09:54:53 compute-1 sudo[176261]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:53 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:54:53 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:53 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:54:53 compute-1 python3.9[176263]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 06 09:54:53 compute-1 systemd[1]: Reloading.
Dec 06 09:54:53 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:53 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c002d50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:54:53 compute-1 systemd-rc-local-generator[176289]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:54:53 compute-1 systemd-sysv-generator[176293]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:54:54 compute-1 sudo[176261]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:54 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 09:54:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:54:54.263 141446 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:54:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:54:54.265 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:54:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:54:54.266 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:54:54 compute-1 sudo[176452]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wvgokhdrtmgwaereixzdsfnhavswtbju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014894.2401125-970-61442696107585/AnsiballZ_systemd.py'
Dec 06 09:54:54 compute-1 sudo[176452]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:54 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:54 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb214003520 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:54:54 compute-1 python3.9[176454]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 06 09:54:54 compute-1 systemd[1]: Reloading.
Dec 06 09:54:55 compute-1 systemd-rc-local-generator[176485]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:54:55 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:54:55 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:54:55 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:55.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:54:55 compute-1 systemd-sysv-generator[176488]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:54:55 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:54:55 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:54:55 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:55.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:54:55 compute-1 ceph-mon[79770]: pgmap v402: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 85 B/s wr, 0 op/s
Dec 06 09:54:55 compute-1 sudo[176452]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:55 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:55 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f4002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:54:55 compute-1 sudo[176642]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kutczkfdmhoszxqyvzvucsllnqsvizpp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014895.4219244-970-201033101828161/AnsiballZ_systemd.py'
Dec 06 09:54:55 compute-1 sudo[176642]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:55 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:55 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:54:56 compute-1 python3.9[176644]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 06 09:54:56 compute-1 systemd[1]: Reloading.
Dec 06 09:54:56 compute-1 podman[176646]: 2025-12-06 09:54:56.114100249 +0000 UTC m=+0.060075521 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent)
Dec 06 09:54:56 compute-1 systemd-sysv-generator[176696]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:54:56 compute-1 systemd-rc-local-generator[176693]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:54:56 compute-1 sudo[176642]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:56 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:56 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c002d50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:54:57 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:54:57 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:54:57 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:57.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:54:57 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:54:57 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:54:57 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:57.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:54:57 compute-1 sudo[176851]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bimvlpacsmqyjcwpqnnezwsftlwsdzvw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014896.8850136-1056-218802847887668/AnsiballZ_systemd.py'
Dec 06 09:54:57 compute-1 sudo[176851]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:57 compute-1 ceph-mon[79770]: pgmap v403: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Dec 06 09:54:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:57 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb214003520 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:54:57 compute-1 python3.9[176853]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 06 09:54:57 compute-1 systemd[1]: Reloading.
Dec 06 09:54:57 compute-1 systemd-rc-local-generator[176885]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:54:57 compute-1 systemd-sysv-generator[176888]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:54:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:57 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f4002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:54:57 compute-1 sudo[176851]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:58 compute-1 sudo[177042]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bwskzybmlpohvrukayfmmcjksdacnfmt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014898.1164944-1056-100684180634228/AnsiballZ_systemd.py'
Dec 06 09:54:58 compute-1 sudo[177042]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:58 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:54:58 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:58 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00032f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:54:58 compute-1 python3.9[177044]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 06 09:54:58 compute-1 systemd[1]: Reloading.
Dec 06 09:54:58 compute-1 systemd-sysv-generator[177076]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:54:58 compute-1 systemd-rc-local-generator[177067]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:54:59 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:54:59 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:54:59 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:54:59.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:54:59 compute-1 sudo[177042]: pam_unix(sudo:session): session closed for user root
Dec 06 09:54:59 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:54:59 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:54:59 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:54:59.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:54:59 compute-1 ceph-mon[79770]: pgmap v404: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 B/s wr, 0 op/s
Dec 06 09:54:59 compute-1 sudo[177232]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwtpggbvrdqwbpbaqswbcmlvbvcalezq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014899.2647388-1056-259040023832139/AnsiballZ_systemd.py'
Dec 06 09:54:59 compute-1 sudo[177232]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:54:59 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:59 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c002d50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:54:59 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:54:59 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb214003520 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:54:59 compute-1 python3.9[177234]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 06 09:54:59 compute-1 systemd[1]: Reloading.
Dec 06 09:55:00 compute-1 systemd-sysv-generator[177269]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:55:00 compute-1 systemd-rc-local-generator[177264]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:55:00 compute-1 sudo[177232]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:00 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:00 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f4002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:00 compute-1 sudo[177423]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eogoessliaxjqhlpvplzitoxxhxvtaru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014900.404173-1056-98648735449120/AnsiballZ_systemd.py'
Dec 06 09:55:00 compute-1 sudo[177423]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:01 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:55:01 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:55:01 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:01.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:55:01 compute-1 python3.9[177425]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 06 09:55:01 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:55:01 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:55:01 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:01.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:55:01 compute-1 sudo[177423]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:01 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:01 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00032f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:01 compute-1 ceph-mon[79770]: pgmap v405: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:55:01 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:01 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c002d50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:01 compute-1 sudo[177578]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vivubsiirwwoxylalqnrivhrogcpmmob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014901.4080875-1056-203165766265776/AnsiballZ_systemd.py'
Dec 06 09:55:01 compute-1 sudo[177578]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:02 compute-1 python3.9[177580]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 06 09:55:02 compute-1 systemd[1]: Reloading.
Dec 06 09:55:02 compute-1 systemd-rc-local-generator[177611]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:55:02 compute-1 systemd-sysv-generator[177614]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:55:02 compute-1 sudo[177578]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:02 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb214003520 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:02 compute-1 ceph-mon[79770]: pgmap v406: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:55:03 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:55:03 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:55:03 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:03.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:55:03 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:55:03 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:55:03 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:03.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:55:03 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:55:03 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:03 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:03 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:03 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:04 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c002d50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/095504 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 1ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec 06 09:55:05 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:55:05 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:55:05 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:05.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:55:05 compute-1 ceph-mon[79770]: pgmap v407: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Dec 06 09:55:05 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:55:05 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:55:05 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:05.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:55:05 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:05 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb214003520 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:05 compute-1 sudo[177770]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jciacyttazenykcnvzoqziebnxfkqmoh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014905.3591976-1164-227028202750800/AnsiballZ_systemd.py'
Dec 06 09:55:05 compute-1 sudo[177770]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:05 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:05 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:05 compute-1 python3.9[177772]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 06 09:55:06 compute-1 systemd[1]: Reloading.
Dec 06 09:55:06 compute-1 systemd-rc-local-generator[177799]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:55:06 compute-1 systemd-sysv-generator[177803]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:55:06 compute-1 systemd[1]: Listening on libvirt proxy daemon socket.
Dec 06 09:55:06 compute-1 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Dec 06 09:55:06 compute-1 sudo[177770]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:06 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:07 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:55:07 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:55:07 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:07.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:55:07 compute-1 sudo[177963]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eflcelcitbfuoiudehcibbbufcryskyd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014906.7740457-1188-41437358917250/AnsiballZ_systemd.py'
Dec 06 09:55:07 compute-1 sudo[177963]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:07 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:55:07 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:55:07 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:07.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:55:07 compute-1 ceph-mon[79770]: pgmap v408: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 06 09:55:07 compute-1 python3.9[177965]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 06 09:55:07 compute-1 sudo[177963]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:07 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c002d50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:07 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb214003520 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:07 compute-1 sudo[178118]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oifeyzseytecseknhwphbmzpyrppehph ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014907.6395037-1188-17498071989197/AnsiballZ_systemd.py'
Dec 06 09:55:07 compute-1 sudo[178118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:08 compute-1 python3.9[178120]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 06 09:55:08 compute-1 sudo[178118]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:08 compute-1 sudo[178138]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 09:55:08 compute-1 sudo[178138]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:55:08 compute-1 sudo[178138]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:08 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:55:08 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:08 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:08 compute-1 sudo[178299]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zldjbuskfsxnxicnnhljpnwkgxykepvi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014908.4624615-1188-225300395526208/AnsiballZ_systemd.py'
Dec 06 09:55:08 compute-1 sudo[178299]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:09 compute-1 python3.9[178301]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 06 09:55:09 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:55:09 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:55:09 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:09.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:55:09 compute-1 sudo[178299]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:09 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:55:09 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:55:09 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:09.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:55:09 compute-1 ceph-mon[79770]: pgmap v409: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:55:09 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 09:55:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:09 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:09 compute-1 sudo[178454]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sjgoxokcozaxfipyfvurldumkfmnhoay ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014909.2720456-1188-215903599917613/AnsiballZ_systemd.py'
Dec 06 09:55:09 compute-1 sudo[178454]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:09 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c002d50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:09 compute-1 python3.9[178456]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 06 09:55:09 compute-1 sudo[178454]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:10 compute-1 sudo[178610]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ncobjvwxcedzhuigszlwkbkhsclotmqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014910.116769-1188-187354990733029/AnsiballZ_systemd.py'
Dec 06 09:55:10 compute-1 sudo[178610]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:10 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:10 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb214003520 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:10 compute-1 python3.9[178612]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 06 09:55:11 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:55:11 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:55:11 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:11.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:55:11 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:55:11 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:55:11 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:11.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:55:11 compute-1 ceph-mon[79770]: pgmap v410: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 06 09:55:11 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:11 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:11 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:11 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:11 compute-1 sudo[178610]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:12 compute-1 sudo[178766]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zyzeogdphrcbikhnirezcgcyrsmvcexk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014911.9516385-1188-126161033084275/AnsiballZ_systemd.py'
Dec 06 09:55:12 compute-1 sudo[178766]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:12 compute-1 python3.9[178768]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 06 09:55:12 compute-1 sudo[178766]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:12 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c002d50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:13 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:55:13 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:55:13 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:13.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:55:13 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:55:13 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:55:13 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:13.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:55:13 compute-1 sudo[178921]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wkfkqgvvrmhknzqwlxwolfqzugeqjxon ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014912.7836964-1188-246223137377851/AnsiballZ_systemd.py'
Dec 06 09:55:13 compute-1 sudo[178921]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:13 compute-1 ceph-mon[79770]: pgmap v411: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 06 09:55:13 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:55:13 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:13 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb214003520 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:13 compute-1 python3.9[178923]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 06 09:55:13 compute-1 sudo[178921]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:13 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:13 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb214003520 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:14 compute-1 sudo[179076]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kxdahipmppolvqoxkolnzcnatodbsntd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014913.8476233-1188-139541616926953/AnsiballZ_systemd.py'
Dec 06 09:55:14 compute-1 sudo[179076]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:14 compute-1 python3.9[179078]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 06 09:55:14 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:14 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 09:55:14 compute-1 sudo[179076]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:14 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:14 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:14 compute-1 sudo[179235]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-onbtghukuvpylblmwnsrtatfhkimghdn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014914.6958268-1188-246578738540727/AnsiballZ_systemd.py'
Dec 06 09:55:14 compute-1 sudo[179235]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:15 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:55:15 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:55:15 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:15.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:55:15 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:55:15 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:55:15 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:15.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:55:15 compute-1 python3.9[179237]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 06 09:55:15 compute-1 ceph-mon[79770]: pgmap v412: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 682 B/s rd, 85 B/s wr, 0 op/s
Dec 06 09:55:15 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:15 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c002d50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:15 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:15 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb218001d70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:16 compute-1 sudo[179235]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:16 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:16 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f4003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:16 compute-1 sudo[179391]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wvvupezqkywcgpnuzwlqvtnbwpgnhqil ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014916.5176287-1188-55227427314644/AnsiballZ_systemd.py'
Dec 06 09:55:16 compute-1 sudo[179391]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:17 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:55:17 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:55:17 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:17.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:55:17 compute-1 python3.9[179393]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 06 09:55:17 compute-1 sudo[179391]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:17 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:55:17 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:55:17 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:17.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:55:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:17 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 09:55:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:17 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 09:55:17 compute-1 ceph-mon[79770]: pgmap v413: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Dec 06 09:55:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:17 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:17 compute-1 sudo[179546]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xryqjbieiwtkuwwtcrtejmkknabgfgwh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014917.3620036-1188-121575548644050/AnsiballZ_systemd.py'
Dec 06 09:55:17 compute-1 sudo[179546]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:17 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c002d50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:18 compute-1 python3.9[179548]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 06 09:55:18 compute-1 sudo[179546]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:18 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:55:18 compute-1 sudo[179702]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lybuxygsenyjpdjgxpgrteejrxqwciws ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014918.2311175-1188-5643303134270/AnsiballZ_systemd.py'
Dec 06 09:55:18 compute-1 sudo[179702]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:18 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c002d50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:18 compute-1 python3.9[179704]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 06 09:55:18 compute-1 sudo[179702]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:19 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:55:19 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:55:19 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:19.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:55:19 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:55:19 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:55:19 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:19.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:55:19 compute-1 sudo[179857]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixtidcmgfjfmjowzkyfaeuadcevdnmrh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014919.055324-1188-6476227837581/AnsiballZ_systemd.py'
Dec 06 09:55:19 compute-1 sudo[179857]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:19 compute-1 ceph-mon[79770]: pgmap v414: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.3 KiB/s rd, 938 B/s wr, 3 op/s
Dec 06 09:55:19 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:19 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c002d50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:19 compute-1 python3.9[179859]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 06 09:55:19 compute-1 sudo[179857]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:19 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:19 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:20 compute-1 sudo[180012]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dcpwbomaactxihdcrhzjcdqovoqhlmlp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014919.8974776-1188-60258194610864/AnsiballZ_systemd.py'
Dec 06 09:55:20 compute-1 sudo[180012]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:20 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:20 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec 06 09:55:20 compute-1 python3.9[180015]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 06 09:55:20 compute-1 sudo[180012]: pam_unix(sudo:session): session closed for user root
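The ansible.builtin.systemd tasks logged between 09:55:12 and 09:55:20 enable the full set of libvirt modular-daemon sockets (virtproxyd, virtqemud, virtsecretd, each with -ro and -admin variants). With enabled=True and state=None the module only enables the units; it does not start them. A minimal sketch of the equivalent systemctl calls (illustrative only; the socket list is taken verbatim from the log):

    import subprocess

    SOCKETS = [
        "virtproxyd.socket", "virtproxyd-ro.socket", "virtproxyd-admin.socket",
        "virtqemud.socket", "virtqemud-ro.socket", "virtqemud-admin.socket",
        "virtsecretd.socket", "virtsecretd-ro.socket", "virtsecretd-admin.socket",
    ]

    for unit in SOCKETS:
        # enabled=True with state=None: enable for next boot, no immediate start.
        subprocess.run(["systemctl", "enable", unit], check=True)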
Dec 06 09:55:20 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:20 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c002d50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:21 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:55:21 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:55:21 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:21.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:55:21 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:55:21 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:55:21 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:21.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:55:21 compute-1 ceph-mon[79770]: pgmap v415: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 938 B/s wr, 3 op/s
Dec 06 09:55:21 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:21 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c002d50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:21 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:21 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb218000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:21 compute-1 podman[180043]: 2025-12-06 09:55:21.835770009 +0000 UTC m=+0.132579191 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 06 09:55:22 compute-1 ceph-mon[79770]: pgmap v416: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 938 B/s wr, 3 op/s
Dec 06 09:55:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:22 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:23 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:55:23 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:55:23 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:23.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:55:23 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:55:23 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:55:23 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:23.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:55:23 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:55:23 compute-1 sudo[180195]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-woixmztmanvbjkfpofkiscxkycjghapa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014923.1987548-1494-214057497157385/AnsiballZ_file.py'
Dec 06 09:55:23 compute-1 sudo[180195]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:23 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:23 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c002d50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:23 compute-1 python3.9[180197]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:55:23 compute-1 sudo[180195]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:23 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:23 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb218000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:24 compute-1 sudo[180347]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kkllrodvfxsdugdvxinmmuytoetbvpiy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014923.9030995-1494-259925647523720/AnsiballZ_file.py'
Dec 06 09:55:24 compute-1 sudo[180347]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:24 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 09:55:24 compute-1 python3.9[180349]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:55:24 compute-1 sudo[180347]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:24 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:24 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:24 compute-1 sudo[180500]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yblymixjjylmlskifbdcthrwypvhssog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014924.5144403-1494-148864352651007/AnsiballZ_file.py'
Dec 06 09:55:24 compute-1 sudo[180500]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:24 compute-1 python3.9[180502]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:55:24 compute-1 sudo[180500]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:25 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:55:25 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:55:25 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:25.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:55:25 compute-1 ceph-mon[79770]: pgmap v417: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.4 KiB/s rd, 1023 B/s wr, 3 op/s
Dec 06 09:55:25 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:55:25 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:55:25 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:25.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:55:25 compute-1 sudo[180652]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qvgpumtzfoduzilxqrbcpculczfvpurj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014925.1151416-1494-91563345693896/AnsiballZ_file.py'
Dec 06 09:55:25 compute-1 sudo[180652]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:25 compute-1 python3.9[180654]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:55:25 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:25 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e00016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:25 compute-1 sudo[180652]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:25 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:25 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c002d50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:26 compute-1 sudo[180804]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-niptepslhcnajfpclhrpftowytyukfem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014925.7502394-1494-242926971131078/AnsiballZ_file.py'
Dec 06 09:55:26 compute-1 sudo[180804]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:26 compute-1 python3.9[180806]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:55:26 compute-1 sudo[180804]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:26 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:26 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb218000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:26 compute-1 sudo[180968]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hdhhaqrjjpgwdpvewkmwmveakdhetxed ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014926.4310179-1494-93938889076798/AnsiballZ_file.py'
Dec 06 09:55:26 compute-1 sudo[180968]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:26 compute-1 podman[180931]: 2025-12-06 09:55:26.744938289 +0000 UTC m=+0.066273241 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 06 09:55:26 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/095526 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec 06 09:55:26 compute-1 python3.9[180974]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:55:26 compute-1 sudo[180968]: pam_unix(sudo:session): session closed for user root
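The ansible.builtin.file tasks between 09:55:23 and 09:55:26 pre-create the configuration and PKI directories that the later libvirt tasks populate, each labeled with the SELinux type container_file_t so containerized services can read them. A hedged Python sketch of the same steps (paths, ownership, and modes are taken from the log; the chcon call is one plausible way to apply the label, an assumption about the mechanism):

    import grp
    import os
    import pwd
    import subprocess

    # (path, owner, group, mode) as recorded in the log; mode None means the
    # task left it unset.
    DIRS = [
        ("/etc/tmpfiles.d", "root", "root", None),
        ("/var/lib/edpm-config/firewall", "root", "root", None),
        ("/etc/pki/libvirt", "root", "root", 0o755),
        ("/etc/pki/libvirt/private", "root", "root", 0o755),
        ("/etc/pki/CA", "root", "root", 0o755),
        ("/etc/pki/qemu", "root", "qemu", None),
    ]

    for path, owner, group, mode in DIRS:
        os.makedirs(path, exist_ok=True)
        os.chown(path, pwd.getpwnam(owner).pw_uid, grp.getgrnam(group).gr_gid)
        if mode is not None:
            os.chmod(path, mode)
        # setype=container_file_t maps to an SELinux relabel of the directory.
        subprocess.run(["chcon", "-t", "container_file_t", path], check=True)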
Dec 06 09:55:27 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:55:27 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:55:27 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:27.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:55:27 compute-1 ceph-mon[79770]: pgmap v418: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.9 KiB/s rd, 938 B/s wr, 2 op/s
Dec 06 09:55:27 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:55:27 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:55:27 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:27.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:55:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:27 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:27 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e00016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:28 compute-1 sudo[181128]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-papxpnztsiblreqqpefkpmvpvwtonyjp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014927.6592553-1623-269269377540596/AnsiballZ_stat.py'
Dec 06 09:55:28 compute-1 sudo[181128]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:28 compute-1 python3.9[181130]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:55:28 compute-1 sudo[181128]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:28 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:55:28 compute-1 sudo[181157]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 09:55:28 compute-1 sudo[181157]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:55:28 compute-1 sudo[181157]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:28 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:28 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c002d50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:28 compute-1 sudo[181279]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vhywnwghdqpbnclqixmfiliybfbzatzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014927.6592553-1623-269269377540596/AnsiballZ_copy.py'
Dec 06 09:55:28 compute-1 sudo[181279]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:29 compute-1 python3.9[181281]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765014927.6592553-1623-269269377540596/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:55:29 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:55:29 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:55:29 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:29.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:55:29 compute-1 sudo[181279]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:29 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:55:29 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:55:29 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:29.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:55:29 compute-1 ceph-mon[79770]: pgmap v419: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.0 KiB/s rd, 938 B/s wr, 3 op/s
Dec 06 09:55:29 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:29 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb218000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:29 compute-1 sudo[181431]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rwhrhhfrltzrrnukgydnydxanixfcxto ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014929.2874403-1623-88322411598992/AnsiballZ_stat.py'
Dec 06 09:55:29 compute-1 sudo[181431]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:29 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:29 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:29 compute-1 python3.9[181433]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:55:29 compute-1 sudo[181431]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:30 compute-1 sudo[181557]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fsomjmdnncqhvecszafzvowtfzzqxavh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014929.2874403-1623-88322411598992/AnsiballZ_copy.py'
Dec 06 09:55:30 compute-1 sudo[181557]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:30 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:30 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb218000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:30 compute-1 python3.9[181559]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765014929.2874403-1623-88322411598992/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:55:30 compute-1 sudo[181557]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:31 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:55:31 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:55:31 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:31.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:55:31 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:55:31 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:55:31 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:31.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:55:31 compute-1 sudo[181709]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wqiomrihdnbicluchxrnauagcqduwrsc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014930.9420683-1623-256027691618372/AnsiballZ_stat.py'
Dec 06 09:55:31 compute-1 sudo[181709]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:31 compute-1 ceph-mon[79770]: pgmap v420: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Dec 06 09:55:31 compute-1 python3.9[181711]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:55:31 compute-1 sudo[181709]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:31 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:31 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e00016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:31 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:31 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c004a20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:31 compute-1 sudo[181834]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kvxjapzwweznzhiuknqmprpksdpllfsa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014930.9420683-1623-256027691618372/AnsiballZ_copy.py'
Dec 06 09:55:31 compute-1 sudo[181834]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:32 compute-1 python3.9[181836]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765014930.9420683-1623-256027691618372/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:55:32 compute-1 sudo[181834]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:32 compute-1 sudo[181987]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cgcmnwbxwyookvyzpxsjidcitfpzchdq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014932.260272-1623-129593959920977/AnsiballZ_stat.py'
Dec 06 09:55:32 compute-1 sudo[181987]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:32 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:32 compute-1 python3.9[181989]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:55:32 compute-1 sudo[181987]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:33 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:55:33 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:55:33 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:33.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:55:33 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:55:33 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:55:33 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:33.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:55:33 compute-1 sudo[182112]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-louotklddngovnofzokjjneuzvroeyya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014932.260272-1623-129593959920977/AnsiballZ_copy.py'
Dec 06 09:55:33 compute-1 sudo[182112]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:33 compute-1 ceph-mon[79770]: pgmap v421: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Dec 06 09:55:33 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:55:33 compute-1 python3.9[182114]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765014932.260272-1623-129593959920977/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:55:33 compute-1 sudo[182112]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:33 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:33 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb218009a40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:33 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:33 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb218009a40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:34 compute-1 sudo[182264]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-smeuqlivmubkvfxoypcuejgonisyjxtw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014933.7534971-1623-115910026622844/AnsiballZ_stat.py'
Dec 06 09:55:34 compute-1 sudo[182264]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:34 compute-1 python3.9[182266]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:55:34 compute-1 sudo[182264]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:34 compute-1 sudo[182390]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pgzqpdiqqarmpqxklkaicbndwuruhxzr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014933.7534971-1623-115910026622844/AnsiballZ_copy.py'
Dec 06 09:55:34 compute-1 sudo[182390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:34 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c004a20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:34 compute-1 python3.9[182392]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765014933.7534971-1623-115910026622844/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:55:34 compute-1 sudo[182390]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:35 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:55:35 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:55:35 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:35.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:55:35 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:55:35 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:55:35 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:35.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:55:35 compute-1 sudo[182542]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jzvexhomahiuagaibmgxzxfhnasdgseu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014934.9705372-1623-270543221199491/AnsiballZ_stat.py'
Dec 06 09:55:35 compute-1 sudo[182542]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:35 compute-1 ceph-mon[79770]: pgmap v422: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 85 B/s wr, 0 op/s
Dec 06 09:55:35 compute-1 python3.9[182544]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:55:35 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:35 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb218009a40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:35 compute-1 sudo[182542]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:35 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:35 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:35 compute-1 sudo[182667]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzfnfvsbrzciemovxrczpxnguczrbnog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014934.9705372-1623-270543221199491/AnsiballZ_copy.py'
Dec 06 09:55:35 compute-1 sudo[182667]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:36 compute-1 python3.9[182669]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765014934.9705372-1623-270543221199491/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:55:36 compute-1 sudo[182667]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:36 compute-1 sudo[182820]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ptqnktbilprzufmvsuqzfzudygwhgxja ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014936.2861311-1623-256080245427790/AnsiballZ_stat.py'
Dec 06 09:55:36 compute-1 sudo[182820]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:36 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:36 compute-1 python3.9[182822]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:55:36 compute-1 sudo[182820]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:37 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:55:37 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:55:37 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:37.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:55:37 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:55:37 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:55:37 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:37.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:55:37 compute-1 sudo[182943]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yhpdojoxcoayzakighshhvfadkklxiit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014936.2861311-1623-256080245427790/AnsiballZ_copy.py'
Dec 06 09:55:37 compute-1 sudo[182943]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:37 compute-1 python3.9[182945]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765014936.2861311-1623-256080245427790/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:55:37 compute-1 ceph-mon[79770]: pgmap v423: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Dec 06 09:55:37 compute-1 sudo[182943]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:37 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:37 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb218009a40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:37 compute-1 sudo[183095]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-suhkmsvkxpnyfuwgovcxwpusvsddmlrj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014937.6553319-1623-227135029814572/AnsiballZ_stat.py'
Dec 06 09:55:37 compute-1 sudo[183095]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:38 compute-1 python3.9[183097]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:55:38 compute-1 sudo[183095]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:38 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:55:38 compute-1 sudo[183221]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ftbvpccrrkgbsfgtqyxixuyrvadvntqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014937.6553319-1623-227135029814572/AnsiballZ_copy.py'
Dec 06 09:55:38 compute-1 sudo[183221]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:38 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:38 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:38 compute-1 python3.9[183223]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765014937.6553319-1623-227135029814572/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:55:38 compute-1 sudo[183221]: pam_unix(sudo:session): session closed for user root
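Every libvirt config deployment from 09:55:28 onward is a stat/copy pair: ansible.legacy.stat fetches the destination's SHA-1 checksum, and ansible.legacy.copy rewrites the file only when that checksum differs from the staged source, which is what makes repeated runs idempotent. A minimal sketch of the idiom (simplified; the real modules also set the owner, group, mode, and SELinux context shown in the log):

    import hashlib
    import shutil
    from pathlib import Path

    def sha1(path: Path) -> str:
        return hashlib.sha1(path.read_bytes()).hexdigest()

    def idempotent_copy(src: Path, dest: Path) -> bool:
        """Deploy src to dest only when contents differ; True means changed."""
        if dest.exists() and sha1(dest) == sha1(src):
            return False         # checksums match: report "ok", skip the write
        shutil.copy2(src, dest)  # checksums differ (or dest missing): write it
        return True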
Dec 06 09:55:39 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:55:39 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:55:39 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:39.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:55:39 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:55:39 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:55:39 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:39.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:55:39 compute-1 ceph-mon[79770]: pgmap v424: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 B/s wr, 0 op/s
Dec 06 09:55:39 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 09:55:39 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:39 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:39 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:39 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c004a20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:40 compute-1 sudo[183374]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gtsazptlmakrmdazaounxroxphjyknrm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014939.996982-1962-67452279962238/AnsiballZ_command.py'
Dec 06 09:55:40 compute-1 sudo[183374]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:40 compute-1 python3.9[183376]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Dec 06 09:55:40 compute-1 sudo[183374]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:40 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb218009a40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:41 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:55:41 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:55:41 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:41.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:55:41 compute-1 sudo[183527]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-owitfugycricgmgxrphzsgaooayomohc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014940.891947-1989-266916289253181/AnsiballZ_file.py'
Dec 06 09:55:41 compute-1 sudo[183527]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:41 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:55:41 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:55:41 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:41.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:55:41 compute-1 python3.9[183529]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:55:41 compute-1 sudo[183527]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:41 compute-1 ceph-mon[79770]: pgmap v425: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:55:41 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:41 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:41 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:41 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:41 compute-1 sudo[183679]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xpptmfodwcgdffpypuhbubegkdlsnzty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014941.6008358-1989-259489946636085/AnsiballZ_file.py'
Dec 06 09:55:41 compute-1 sudo[183679]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:42 compute-1 python3.9[183681]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:55:42 compute-1 sudo[183679]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:42 compute-1 sudo[183707]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:55:42 compute-1 sudo[183707]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:55:42 compute-1 sudo[183707]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:42 compute-1 sudo[183767]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 06 09:55:42 compute-1 sudo[183767]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:55:42 compute-1 sudo[183884]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hledvfvzesntdflfjuyieikydpvwomnr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014942.324287-1989-13812846097615/AnsiballZ_file.py'
Dec 06 09:55:42 compute-1 sudo[183884]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:42 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c004a20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:42 compute-1 python3.9[183891]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:55:42 compute-1 sudo[183884]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:42 compute-1 sudo[183767]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:43 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:55:43 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:55:43 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:43.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:55:43 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:55:43 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:55:43 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:43.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:55:43 compute-1 sudo[184064]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-blzmqvudskkcicrasfibspukpocbjpel ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014943.0127263-1989-206954645469113/AnsiballZ_file.py'
Dec 06 09:55:43 compute-1 sudo[184064]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:43 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:55:43 compute-1 python3.9[184066]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:55:43 compute-1 sudo[184064]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:43 compute-1 ceph-mon[79770]: pgmap v426: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:55:43 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:43 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb218009a40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:43 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:43 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:43 compute-1 sudo[184216]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jznzpwshalazeqewjworwraixwwycsxl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014943.68874-1989-76258395470734/AnsiballZ_file.py'
Dec 06 09:55:43 compute-1 sudo[184216]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:44 compute-1 python3.9[184218]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:55:44 compute-1 sudo[184216]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:44 compute-1 ceph-mon[79770]: pgmap v427: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Dec 06 09:55:44 compute-1 sudo[184369]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vtacvpykvmxrzfnujbeldmjuspbxzgbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014944.2889483-1989-86659871527358/AnsiballZ_file.py'
Dec 06 09:55:44 compute-1 sudo[184369]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:44 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:44 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:44 compute-1 python3.9[184371]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:55:44 compute-1 sudo[184369]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:45 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:55:45 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:55:45 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:45.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:55:45 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:55:45 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:55:45 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:45.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:55:45 compute-1 sudo[184521]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jsjcqrmpdwxxjzdwdqceaomksjghcjxp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014944.9826875-1989-193098166744390/AnsiballZ_file.py'
Dec 06 09:55:45 compute-1 sudo[184521]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:45 compute-1 python3.9[184523]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:55:45 compute-1 sudo[184521]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:45 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:45 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb218009a40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:45 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:45 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb218009a40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:45 compute-1 sudo[184675]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvvyhpdhnvdapbzyqallpqrpbafruote ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014945.6604743-1989-267440849028830/AnsiballZ_file.py'
Dec 06 09:55:45 compute-1 sudo[184675]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:46 compute-1 python3.9[184677]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:55:46 compute-1 sudo[184675]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:46 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:46 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:46 compute-1 sudo[184828]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qlmxfhiwxbgfblwwoapoattjkbzoviwx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014946.3954558-1989-98783963550869/AnsiballZ_file.py'
Dec 06 09:55:46 compute-1 sudo[184828]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:46 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:55:46 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:55:46 compute-1 ceph-mon[79770]: pgmap v428: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:55:46 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 09:55:46 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 06 09:55:46 compute-1 python3.9[184830]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:55:46 compute-1 sudo[184828]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:47 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:55:47 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:55:47 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:47.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:55:47 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:55:47 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:55:47 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:47.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:55:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:47 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:47 compute-1 sudo[184980]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdaqajpfjxnfigcrgvyqmpzgiddacoko ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014947.286481-1989-225064458568342/AnsiballZ_file.py'
Dec 06 09:55:47 compute-1 sudo[184980]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:47 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:55:47 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:55:47 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 06 09:55:47 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 06 09:55:47 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 09:55:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:47 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e8000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:47 compute-1 python3.9[184982]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:55:47 compute-1 sudo[184980]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:48 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:55:48 compute-1 sudo[185133]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yoodmlqmicanpdgdnwyxvqphmdfavmnv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014948.2272687-1989-211771510906825/AnsiballZ_file.py'
Dec 06 09:55:48 compute-1 sudo[185133]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:48 compute-1 sudo[185135]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 09:55:48 compute-1 sudo[185135]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:55:48 compute-1 sudo[185135]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:48 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:48 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c004a20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:48 compute-1 python3.9[185136]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:55:48 compute-1 ceph-mon[79770]: pgmap v429: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Dec 06 09:55:48 compute-1 sudo[185133]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:49 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:55:49 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:55:49 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:49.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:55:49 compute-1 sudo[185310]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ymususnnzoppalwlhbhmbobpaeiajtui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014948.9177425-1989-278285707597868/AnsiballZ_file.py'
Dec 06 09:55:49 compute-1 sudo[185310]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:49 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:55:49 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:55:49 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:49.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:55:49 compute-1 python3.9[185312]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:55:49 compute-1 sudo[185310]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:49 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:49 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb214001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:49 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:49 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:49 compute-1 sudo[185462]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jktsykigpjwgpwtjwovxbheibjnoekdj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014949.5575392-1989-29872752639120/AnsiballZ_file.py'
Dec 06 09:55:49 compute-1 sudo[185462]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:50 compute-1 python3.9[185464]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:55:50 compute-1 sudo[185462]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:50 compute-1 sudo[185615]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sqeardgighhllnyakkwqrzhwztrwviyh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014950.2271929-1989-196134056211094/AnsiballZ_file.py'
Dec 06 09:55:50 compute-1 sudo[185615]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:50 compute-1 python3.9[185617]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:55:50 compute-1 sudo[185615]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:50 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:50 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e80016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:51 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:55:51 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:55:51 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:51.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:55:51 compute-1 ceph-mon[79770]: pgmap v430: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:55:51 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:55:51 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:55:51 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:51.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:55:51 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:51 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c004a20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:51 compute-1 sudo[185767]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ofbtraswydrazwcmtjuawewwljmvckeu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014951.4663227-2286-127800449495628/AnsiballZ_stat.py'
Dec 06 09:55:51 compute-1 sudo[185767]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:51 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:51 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140029b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:51 compute-1 python3.9[185769]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:55:51 compute-1 sudo[185767]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:52 compute-1 sudo[185770]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:55:52 compute-1 sudo[185770]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:55:52 compute-1 sudo[185770]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:52 compute-1 podman[185815]: 2025-12-06 09:55:52.149266552 +0000 UTC m=+0.088141295 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller)
Dec 06 09:55:52 compute-1 sudo[185942]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kdbypxqmeuxpcppbrcdoudugtvecqrys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014951.4663227-2286-127800449495628/AnsiballZ_copy.py'
Dec 06 09:55:52 compute-1 sudo[185942]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:52 compute-1 python3.9[185944]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014951.4663227-2286-127800449495628/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:55:52 compute-1 sudo[185942]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:52 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:52 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:55:52 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:55:52 compute-1 ceph-mon[79770]: pgmap v431: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:55:52 compute-1 sudo[186094]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hiufbdwdtalarijsppjrtdtmmvhrnfah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014952.6533813-2286-135091598438569/AnsiballZ_stat.py'
Dec 06 09:55:52 compute-1 sudo[186094]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:53 compute-1 python3.9[186096]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:55:53 compute-1 sudo[186094]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:53 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:55:53 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:55:53 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:53.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:55:53 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:55:53 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:55:53 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:53.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:55:53 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:55:53 compute-1 sudo[186217]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uaanhkbjejinwdqfnrynjargbeafukjj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014952.6533813-2286-135091598438569/AnsiballZ_copy.py'
Dec 06 09:55:53 compute-1 sudo[186217]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:53 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:53 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:53 compute-1 python3.9[186219]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014952.6533813-2286-135091598438569/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:55:53 compute-1 sudo[186217]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:53 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:53 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c004a20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:53 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 09:55:54 compute-1 sudo[186369]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lbktgfptrrlhicssyyswmltrychxtcnk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014953.86701-2286-218499335610294/AnsiballZ_stat.py'
Dec 06 09:55:54 compute-1 sudo[186369]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:55:54.264 141446 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:55:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:55:54.265 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:55:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:55:54.265 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:55:54 compute-1 python3.9[186371]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:55:54 compute-1 sudo[186369]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:54 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:54 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140029b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:54 compute-1 sudo[186493]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vyglguasgcugymbpmoxgakealsrvkzjn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014953.86701-2286-218499335610294/AnsiballZ_copy.py'
Dec 06 09:55:54 compute-1 sudo[186493]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:54 compute-1 python3.9[186495]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014953.86701-2286-218499335610294/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:55:54 compute-1 ceph-mon[79770]: pgmap v432: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Dec 06 09:55:54 compute-1 sudo[186493]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:55 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:55:55 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:55:55 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:55.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:55:55 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:55:55 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:55:55 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:55.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:55:55 compute-1 sudo[186645]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ergmsyohwffzhmnnxslmempezcwgnfhy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014955.134994-2286-132600693924501/AnsiballZ_stat.py'
Dec 06 09:55:55 compute-1 sudo[186645]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:55 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:55 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e8001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:55 compute-1 python3.9[186647]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:55:55 compute-1 sudo[186645]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:55 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:55 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:56 compute-1 sudo[186768]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-isrxqglhqkzavltdqdakghsnidqprsnc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014955.134994-2286-132600693924501/AnsiballZ_copy.py'
Dec 06 09:55:56 compute-1 sudo[186768]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:56 compute-1 python3.9[186770]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014955.134994-2286-132600693924501/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:55:56 compute-1 sudo[186768]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:56 compute-1 sudo[186921]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qzfdcqdnyqhkqvvrwgmmjikahfljgowy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014956.3551629-2286-241862508862875/AnsiballZ_stat.py'
Dec 06 09:55:56 compute-1 sudo[186921]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:56 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:56 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c004a20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:56 compute-1 python3.9[186923]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:55:56 compute-1 sudo[186921]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:57 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:55:57 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:55:57 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:57.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:55:57 compute-1 sudo[187057]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iskdplzqpavqvxdsvymozchspfstrfku ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014956.3551629-2286-241862508862875/AnsiballZ_copy.py'
Dec 06 09:55:57 compute-1 sudo[187057]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:57 compute-1 podman[187018]: 2025-12-06 09:55:57.196214895 +0000 UTC m=+0.055871763 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 06 09:55:57 compute-1 ceph-mon[79770]: pgmap v433: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:55:57 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:55:57 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:55:57 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:57.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:55:57 compute-1 python3.9[187065]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014956.3551629-2286-241862508862875/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:55:57 compute-1 sudo[187057]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:57 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140029b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:57 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e8001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:57 compute-1 sudo[187215]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xegelcixylosnsjkbuxoqcyctfkebzcv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014957.5776188-2286-159478002584401/AnsiballZ_stat.py'
Dec 06 09:55:57 compute-1 sudo[187215]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:58 compute-1 python3.9[187217]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:55:58 compute-1 sudo[187215]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:58 compute-1 sudo[187339]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-plqmpanhoqbbejrwmvfhaklzwrvvsmaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014957.5776188-2286-159478002584401/AnsiballZ_copy.py'
Dec 06 09:55:58 compute-1 sudo[187339]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:58 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:55:58 compute-1 python3.9[187341]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014957.5776188-2286-159478002584401/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:55:58 compute-1 sudo[187339]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:58 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:58 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:59 compute-1 sudo[187491]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-anselcrfepznqzijgykatmyctybkgbpx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014958.7831938-2286-58637370012528/AnsiballZ_stat.py'
Dec 06 09:55:59 compute-1 sudo[187491]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:59 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:55:59 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:55:59 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:55:59.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:55:59 compute-1 ceph-mon[79770]: pgmap v434: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Dec 06 09:55:59 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:55:59 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:55:59 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:55:59.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:55:59 compute-1 python3.9[187493]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:55:59 compute-1 sudo[187491]: pam_unix(sudo:session): session closed for user root
Dec 06 09:55:59 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:59 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c004a20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:59 compute-1 sudo[187614]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vfkjnbgyreadsqjyuvsnvzinmmezuqvx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014958.7831938-2286-58637370012528/AnsiballZ_copy.py'
Dec 06 09:55:59 compute-1 sudo[187614]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:55:59 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:55:59 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c004a20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:55:59 compute-1 python3.9[187616]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014958.7831938-2286-58637370012528/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:55:59 compute-1 sudo[187614]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:00 compute-1 sudo[187767]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-leytkykvcbuyrmgzydqmgqtndoxehiee ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014960.0902197-2286-99531107673205/AnsiballZ_stat.py'
Dec 06 09:56:00 compute-1 sudo[187767]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:00 compute-1 python3.9[187769]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:56:00 compute-1 sudo[187767]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:00 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:00 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e8001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:00 compute-1 sudo[187890]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eauvgojveuizwwvaaxvjmaatjyzvzehd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014960.0902197-2286-99531107673205/AnsiballZ_copy.py'
Dec 06 09:56:00 compute-1 sudo[187890]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:01 compute-1 python3.9[187892]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014960.0902197-2286-99531107673205/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:56:01 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:56:01 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:56:01 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:01.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:56:01 compute-1 sudo[187890]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:01 compute-1 ceph-mon[79770]: pgmap v435: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:56:01 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:56:01 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:56:01 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:01.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
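[annotation] The paired "starting new request"/"req done" entries arriving every two seconds from 192.168.122.102 and 192.168.122.100 are anonymous "HEAD / HTTP/1.0" probes, i.e. load-balancer health checks against the RGW beast frontend. A minimal way to reproduce such a probe by hand — the RGW address and port are not shown in this excerpt, so the URL below is a placeholder:

    # Hypothetical endpoint; substitute the real RGW host and port.
    curl -sI http://<rgw-host>:<rgw-port>/ | head -n1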
Dec 06 09:56:01 compute-1 sudo[188042]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pregsciezzggrvtnggjofvxnmeavvbox ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014961.3119662-2286-208247083521326/AnsiballZ_stat.py'
Dec 06 09:56:01 compute-1 sudo[188042]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:01 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:01 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:01 compute-1 python3.9[188044]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:56:01 compute-1 sudo[188042]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:01 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:01 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c004a20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:02 compute-1 sudo[188165]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ggsiuxibtolortaaqfjskqltalgmxzzd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014961.3119662-2286-208247083521326/AnsiballZ_copy.py'
Dec 06 09:56:02 compute-1 sudo[188165]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:02 compute-1 python3.9[188167]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014961.3119662-2286-208247083521326/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:56:02 compute-1 sudo[188165]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:02 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140037d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:02 compute-1 sudo[188318]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uxpgpjknqdxwujyeokpzpdhngbkqcpbl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014962.4706473-2286-15615144229666/AnsiballZ_stat.py'
Dec 06 09:56:02 compute-1 sudo[188318]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:02 compute-1 python3.9[188320]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:56:02 compute-1 sudo[188318]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:03 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:56:03 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:56:03 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:03.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:56:03 compute-1 ceph-mon[79770]: pgmap v436: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:56:03 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:56:03 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:56:03 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:03.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:56:03 compute-1 sudo[188441]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-liucahcukvkewtkrpkvhaoctigarjces ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014962.4706473-2286-15615144229666/AnsiballZ_copy.py'
Dec 06 09:56:03 compute-1 sudo[188441]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:03 compute-1 python3.9[188443]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014962.4706473-2286-15615144229666/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:56:03 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:56:03 compute-1 sudo[188441]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:03 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:03 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e80032f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:03 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:03 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:03 compute-1 sudo[188593]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vitzxirtwbrhqyuirrfpxitjaegytmnt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014963.6607192-2286-251782830922038/AnsiballZ_stat.py'
Dec 06 09:56:03 compute-1 sudo[188593]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:04 compute-1 python3.9[188595]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:56:04 compute-1 sudo[188593]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:04 compute-1 sudo[188717]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-adecrbcmopcqxqgclfykkjzhxpltqigx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014963.6607192-2286-251782830922038/AnsiballZ_copy.py'
Dec 06 09:56:04 compute-1 sudo[188717]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:04 compute-1 python3.9[188719]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014963.6607192-2286-251782830922038/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:56:04 compute-1 sudo[188717]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:04 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c004a20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:05 compute-1 sudo[188869]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-leqzfyjvtertbbffgskdtrjedjikxczz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014964.8524275-2286-52076285670208/AnsiballZ_stat.py'
Dec 06 09:56:05 compute-1 sudo[188869]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:05 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:56:05 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:56:05 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:05.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:56:05 compute-1 ceph-mon[79770]: pgmap v437: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Dec 06 09:56:05 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:56:05 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:56:05 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:05.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:56:05 compute-1 python3.9[188871]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:56:05 compute-1 sudo[188869]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:05 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:05 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140037d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:05 compute-1 sudo[188992]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ggvwoekhrfqtlprfagibsnuxuzjntcti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014964.8524275-2286-52076285670208/AnsiballZ_copy.py'
Dec 06 09:56:05 compute-1 sudo[188992]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:05 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:05 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e80032f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:05 compute-1 python3.9[188994]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014964.8524275-2286-52076285670208/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:56:05 compute-1 sudo[188992]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:06 compute-1 sudo[189145]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tcptkbtbyglsmhbtbqvjijefsgmxylmf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014966.1406472-2286-32012174891742/AnsiballZ_stat.py'
Dec 06 09:56:06 compute-1 sudo[189145]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:06 compute-1 python3.9[189147]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:56:06 compute-1 sudo[189145]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:06 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:07 compute-1 sudo[189268]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qmsknffojfowzkvtdmjcdccidogzicxi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014966.1406472-2286-32012174891742/AnsiballZ_copy.py'
Dec 06 09:56:07 compute-1 sudo[189268]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:07 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:56:07 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:56:07 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:07.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:56:07 compute-1 python3.9[189270]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014966.1406472-2286-32012174891742/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:56:07 compute-1 sudo[189268]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:07 compute-1 ceph-mon[79770]: pgmap v438: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:56:07 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:56:07 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:56:07 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:07.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:56:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:07 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c004a20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/095607 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
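[annotation] The Layer4 "Connection refused" failure above marks haproxy taking nfs.cephfs.2 out of rotation while the ganesha daemons are being restarted. Backend state can be inspected over haproxy's runtime API; a sketch, assuming a stats socket at a conventional path (the actual socket path does not appear in this log — note the backend here really is named "backend"):

    # "show servers state <backend>" is haproxy's runtime API command;
    # the socket path below is an assumption.
    echo 'show servers state backend' | socat stdio /var/lib/haproxy/stats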
Dec 06 09:56:07 compute-1 sudo[189420]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxbehobftyumgnprtetllqadqkypajhj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014967.4085715-2286-132152143828031/AnsiballZ_stat.py'
Dec 06 09:56:07 compute-1 sudo[189420]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:07 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140037d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:07 compute-1 python3.9[189422]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:56:07 compute-1 sudo[189420]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:08 compute-1 sudo[189544]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qaoszfexkmqiwbrldcyxbawpafcgprrr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014967.4085715-2286-132152143828031/AnsiballZ_copy.py'
Dec 06 09:56:08 compute-1 sudo[189544]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:08 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:56:08 compute-1 python3.9[189546]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765014967.4085715-2286-132152143828031/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:56:08 compute-1 sudo[189544]: pam_unix(sudo:session): session closed for user root
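[annotation] The stat/copy pairs above install the same rendered template (libvirt-socket.unit.j2, identical checksum 0bad41f409b4ee7e780a2a59dc18f5c84ed99826) as a systemd drop-in under each virtproxyd, virtqemud and virtsecretd socket unit's .d directory. The rendered content itself is redacted (content=NOT_LOGGING_PARAMETER); a hypothetical drop-in of this shape, written the same way by hand, might be:

    # Illustrative only: the real override.conf content is not logged.
    # SocketMode/SocketGroup are standard systemd [Socket] options.
    cat <<'EOF' > /etc/systemd/system/virtqemud.socket.d/override.conf
    [Socket]
    SocketMode=0660
    SocketGroup=libvirt
    EOF
    systemctl daemon-reload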
Dec 06 09:56:08 compute-1 sudo[189552]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 09:56:08 compute-1 sudo[189552]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:56:08 compute-1 sudo[189552]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:08 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:08 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e8004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:09 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:56:09 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:56:09 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:09.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:56:09 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:56:09 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:56:09 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:09.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:56:09 compute-1 ceph-mon[79770]: pgmap v439: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Dec 06 09:56:09 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 09:56:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:09 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:09 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c004a20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:10 compute-1 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #28. Immutable memtables: 0.
Dec 06 09:56:10 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:56:10.002398) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 06 09:56:10 compute-1 ceph-mon[79770]: rocksdb: [db/flush_job.cc:856] [default] [JOB 13] Flushing memtable with next log file: 28
Dec 06 09:56:10 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014970002640, "job": 13, "event": "flush_started", "num_memtables": 1, "num_entries": 4355, "num_deletes": 501, "total_data_size": 11825558, "memory_usage": 11978824, "flush_reason": "Manual Compaction"}
Dec 06 09:56:10 compute-1 ceph-mon[79770]: rocksdb: [db/flush_job.cc:885] [default] [JOB 13] Level-0 flush table #29: started
Dec 06 09:56:10 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014970040683, "cf_name": "default", "job": 13, "event": "table_file_creation", "file_number": 29, "file_size": 4439644, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13251, "largest_seqno": 17601, "table_properties": {"data_size": 4428209, "index_size": 6457, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3909, "raw_key_size": 30991, "raw_average_key_size": 19, "raw_value_size": 4401079, "raw_average_value_size": 2824, "num_data_blocks": 282, "num_entries": 1558, "num_filter_entries": 1558, "num_deletions": 501, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014557, "oldest_key_time": 1765014557, "file_creation_time": 1765014970, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 29, "seqno_to_time_mapping": "N/A"}}
Dec 06 09:56:10 compute-1 ceph-mon[79770]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 13] Flush lasted 38271 microseconds, and 12276 cpu microseconds.
Dec 06 09:56:10 compute-1 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 09:56:10 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:56:10.040761) [db/flush_job.cc:967] [default] [JOB 13] Level-0 flush table #29: 4439644 bytes OK
Dec 06 09:56:10 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:56:10.040805) [db/memtable_list.cc:519] [default] Level-0 commit table #29 started
Dec 06 09:56:10 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:56:10.042867) [db/memtable_list.cc:722] [default] Level-0 commit table #29: memtable #1 done
Dec 06 09:56:10 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:56:10.042894) EVENT_LOG_v1 {"time_micros": 1765014970042888, "job": 13, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 06 09:56:10 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:56:10.042916) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 06 09:56:10 compute-1 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 13] Try to delete WAL files size 11806248, prev total WAL file size 11806248, number of live WAL files 2.
Dec 06 09:56:10 compute-1 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000025.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 09:56:10 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:56:10.046103) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D67727374617400323530' seq:72057594037927935, type:22 .. '6D67727374617400353031' seq:0, type:0; will stop at (end)
Dec 06 09:56:10 compute-1 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 14] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 06 09:56:10 compute-1 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 13 Base level 0, inputs: [29(4335KB)], [27(13MB)]
Dec 06 09:56:10 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014970046268, "job": 14, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [29], "files_L6": [27], "score": -1, "input_data_size": 18464773, "oldest_snapshot_seqno": -1}
Dec 06 09:56:10 compute-1 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 14] Generated table #30: 5031 keys, 13936321 bytes, temperature: kUnknown
Dec 06 09:56:10 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014970178300, "cf_name": "default", "job": 14, "event": "table_file_creation", "file_number": 30, "file_size": 13936321, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13900701, "index_size": 21942, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12613, "raw_key_size": 125913, "raw_average_key_size": 25, "raw_value_size": 13807514, "raw_average_value_size": 2744, "num_data_blocks": 917, "num_entries": 5031, "num_filter_entries": 5031, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014012, "oldest_key_time": 0, "file_creation_time": 1765014970, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 30, "seqno_to_time_mapping": "N/A"}}
Dec 06 09:56:10 compute-1 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 09:56:10 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:56:10.178807) [db/compaction/compaction_job.cc:1663] [default] [JOB 14] Compacted 1@0 + 1@6 files to L6 => 13936321 bytes
Dec 06 09:56:10 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:56:10.179962) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 139.6 rd, 105.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(4.2, 13.4 +0.0 blob) out(13.3 +0.0 blob), read-write-amplify(7.3) write-amplify(3.1) OK, records in: 5852, records dropped: 821 output_compression: NoCompression
Dec 06 09:56:10 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:56:10.179979) EVENT_LOG_v1 {"time_micros": 1765014970179971, "job": 14, "event": "compaction_finished", "compaction_time_micros": 132243, "compaction_time_cpu_micros": 48198, "output_level": 6, "num_output_files": 1, "total_output_size": 13936321, "num_input_records": 5852, "num_output_records": 5031, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 06 09:56:10 compute-1 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000029.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 09:56:10 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014970181120, "job": 14, "event": "table_file_deletion", "file_number": 29}
Dec 06 09:56:10 compute-1 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000027.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 09:56:10 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765014970184266, "job": 14, "event": "table_file_deletion", "file_number": 27}
Dec 06 09:56:10 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:56:10.045943) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 09:56:10 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:56:10.184439) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 09:56:10 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:56:10.184447) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 09:56:10 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:56:10.184449) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 09:56:10 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:56:10.184451) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 09:56:10 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:56:10.184453) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
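[annotation] The flush (JOB 13) and compaction (JOB 14) above are a manual compaction of the monitor's RocksDB store (flush_reason "Manual Compaction"): L0 table #29 plus L6 table #27 were rewritten into #30 and both inputs deleted. The same operation can be requested explicitly — illustrative; the log does not show what issued this one:

    # Ask this monitor to compact its backing store.
    ceph tell mon.compute-1 compact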
Dec 06 09:56:10 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:10 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140037d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:11 compute-1 ceph-mon[79770]: pgmap v440: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:56:11 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:56:11 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:56:11 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:11.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:56:11 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:56:11 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:56:11 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:11.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:56:11 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:11 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e8004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:11 compute-1 python3.9[189722]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                             ls -lRZ /run/libvirt | grep -E ':container_\S+_t'
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:56:11 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:11 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:12 compute-1 sudo[189876]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rpstfnmcupvkskurkkmvhfcfvvxustzy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014972.0351636-2904-93824448009526/AnsiballZ_seboolean.py'
Dec 06 09:56:12 compute-1 sudo[189876]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:12 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c004a20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:12 compute-1 python3.9[189878]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
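[annotation] The seboolean task above is the module form of a persistent SELinux boolean change; done by hand it would be:

    # Equivalent of ansible.posix.seboolean with persistent=True, state=True.
    setsebool -P os_enable_vtpm on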
Dec 06 09:56:13 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:56:13 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:56:13 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:13.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:56:13 compute-1 ceph-mon[79770]: pgmap v441: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:56:13 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:56:13 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:56:13 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:13.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:56:13 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:56:13 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:13 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140037d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:13 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:13 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e8004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:14 compute-1 sudo[189876]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:14 compute-1 sudo[190034]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lauqzglnasvffalnkozogfryghiqoobn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014974.2622101-2928-5673418770469/AnsiballZ_copy.py'
Dec 06 09:56:14 compute-1 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=15 res=1
Dec 06 09:56:14 compute-1 sudo[190034]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:14 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:14 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:14 compute-1 python3.9[190036]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:56:14 compute-1 sudo[190034]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:15 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:56:15 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:56:15 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:15.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:56:15 compute-1 ceph-mon[79770]: pgmap v442: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Dec 06 09:56:15 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:56:15 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 09:56:15 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:15.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 09:56:15 compute-1 sudo[190186]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uprktuvyuusrdssplrvlgnhxmrdiagrw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014974.975325-2928-122914211884385/AnsiballZ_copy.py'
Dec 06 09:56:15 compute-1 sudo[190186]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:15 compute-1 python3.9[190188]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:56:15 compute-1 sudo[190186]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:15 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:15 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c004a20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:15 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:15 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c004a20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:15 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:15 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 09:56:16 compute-1 sudo[190338]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sajardguonsfnxsiygtyidenemwvrhoy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014975.7111237-2928-76201275065017/AnsiballZ_copy.py'
Dec 06 09:56:16 compute-1 sudo[190338]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:16 compute-1 python3.9[190340]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:56:16 compute-1 sudo[190338]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:16 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:16 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e8004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:16 compute-1 sudo[190491]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sektfqbpizcofkntjnajhaetynudgtbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014976.5112438-2928-59706055538349/AnsiballZ_copy.py'
Dec 06 09:56:16 compute-1 sudo[190491]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:16 compute-1 python3.9[190493]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:56:17 compute-1 sudo[190491]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:17 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:56:17 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:56:17 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:17.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:56:17 compute-1 ceph-mon[79770]: pgmap v443: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 06 09:56:17 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:56:17 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:56:17 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:17.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:56:17 compute-1 sudo[190643]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kqpmpcfoiyuogfsygnquvfcpjownllvt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014977.1744168-2928-55556939989720/AnsiballZ_copy.py'
Dec 06 09:56:17 compute-1 sudo[190643]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:17 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:17 compute-1 python3.9[190645]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:56:17 compute-1 sudo[190643]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:17 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:18 compute-1 sudo[190798]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xtcjxdqxpbvrrnatrbjwmyxrngzsfgcs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014978.0717247-3036-6654213455644/AnsiballZ_copy.py'
Dec 06 09:56:18 compute-1 sudo[190798]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:18 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:56:18 compute-1 python3.9[190800]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:56:18 compute-1 sudo[190798]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:18 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2180012b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:18 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 09:56:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:18 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 09:56:19 compute-1 sudo[190950]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rrolfkgapvillfpbxbliokqmdvrfpngb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014978.7339537-3036-184548452305908/AnsiballZ_copy.py'
Dec 06 09:56:19 compute-1 sudo[190950]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:19 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:56:19 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:56:19 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:19.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:56:19 compute-1 python3.9[190952]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:56:19 compute-1 sudo[190950]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:19 compute-1 ceph-mon[79770]: pgmap v444: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 426 B/s wr, 1 op/s
Dec 06 09:56:19 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:56:19 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:56:19 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:19.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:56:19 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:19 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e8004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:19 compute-1 sudo[191102]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tbrpwndbziyradpmjprwhyjywtnjeess ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014979.380517-3036-142850701716277/AnsiballZ_copy.py'
Dec 06 09:56:19 compute-1 sudo[191102]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:19 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:19 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:19 compute-1 python3.9[191104]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:56:19 compute-1 sudo[191102]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:20 compute-1 sudo[191255]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mrhlkbmzbuthvfkfoewtvzpytqmlaata ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014979.9963198-3036-243468686605352/AnsiballZ_copy.py'
Dec 06 09:56:20 compute-1 sudo[191255]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:20 compute-1 python3.9[191257]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:56:20 compute-1 sudo[191255]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:20 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:20 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:20 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/095620 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec 06 09:56:20 compute-1 sudo[191407]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldvcbpziusgtytblzvzdmyprddyjdzyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014980.6209116-3036-152002179685606/AnsiballZ_copy.py'
Dec 06 09:56:20 compute-1 sudo[191407]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:21 compute-1 python3.9[191409]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:56:21 compute-1 sudo[191407]: pam_unix(sudo:session): session closed for user root
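[annotation] With the server/client certificate, key and CA copies above, /etc/pki/libvirt and /etc/pki/qemu now hold the full TLS material for libvirt/QEMU. A quick consistency check against the installed CAs — commands are illustrative; the paths are the dest= values from the copy tasks:

    openssl verify -CAfile /etc/pki/CA/cacert.pem /etc/pki/libvirt/servercert.pem
    openssl verify -CAfile /etc/pki/qemu/ca-cert.pem /etc/pki/qemu/server-cert.pem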
Dec 06 09:56:21 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:56:21 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:56:21 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:21.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:56:21 compute-1 ceph-mon[79770]: pgmap v445: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 938 B/s rd, 426 B/s wr, 1 op/s
Dec 06 09:56:21 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:56:21 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:56:21 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:21.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:56:21 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:21 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2180012b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:21 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:21 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e8004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:22 compute-1 sudo[191559]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vbejrhioshhhujrtjodchvknkwpuvomm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014981.9097307-3144-209510528976431/AnsiballZ_systemd.py'
Dec 06 09:56:22 compute-1 sudo[191559]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:22 compute-1 podman[191561]: 2025-12-06 09:56:22.356116776 +0000 UTC m=+0.113497436 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
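[annotation] The health_status=healthy event above comes from podman periodically executing the container's configured healthcheck ('test': '/openstack/healthcheck' in the config_data). It can also be run on demand:

    # Execute the configured healthcheck once; exit status 0 means healthy.
    podman healthcheck run ovn_controller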
Dec 06 09:56:22 compute-1 python3.9[191563]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 06 09:56:22 compute-1 systemd[1]: Reloading.
Dec 06 09:56:22 compute-1 systemd-rc-local-generator[191615]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:56:22 compute-1 systemd-sysv-generator[191619]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update the package to include a native systemd unit file, in order to make it safer and more robust.
Dec 06 09:56:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:22 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:22 compute-1 systemd[1]: Starting libvirt logging daemon socket...
Dec 06 09:56:22 compute-1 systemd[1]: Listening on libvirt logging daemon socket.
Dec 06 09:56:22 compute-1 systemd[1]: Starting libvirt logging daemon admin socket...
Dec 06 09:56:22 compute-1 systemd[1]: Listening on libvirt logging daemon admin socket.
Dec 06 09:56:22 compute-1 systemd[1]: Starting libvirt logging daemon...
Dec 06 09:56:23 compute-1 systemd[1]: Started libvirt logging daemon.
Dec 06 09:56:23 compute-1 sudo[191559]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:23 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:56:23 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:56:23 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:23.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:56:23 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:56:23 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:56:23 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:23.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:56:23 compute-1 ceph-mon[79770]: pgmap v446: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 938 B/s rd, 426 B/s wr, 1 op/s
Dec 06 09:56:23 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:56:23 compute-1 sudo[191780]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ittqrizbrgfhvotjnichjryuvbyjrbzc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014983.2041445-3144-35706484406260/AnsiballZ_systemd.py'
Dec 06 09:56:23 compute-1 sudo[191780]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:23 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:23 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:23 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:23 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2180012b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:23 compute-1 python3.9[191782]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 06 09:56:23 compute-1 systemd[1]: Reloading.
Dec 06 09:56:23 compute-1 systemd-sysv-generator[191809]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update the package to include a native systemd unit file, in order to make it safer and more robust.
Dec 06 09:56:23 compute-1 systemd-rc-local-generator[191804]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:56:24 compute-1 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Dec 06 09:56:24 compute-1 systemd[1]: Starting libvirt nodedev daemon socket...
Dec 06 09:56:24 compute-1 systemd[1]: Listening on libvirt nodedev daemon socket.
Dec 06 09:56:24 compute-1 systemd[1]: Starting libvirt nodedev daemon admin socket...
Dec 06 09:56:24 compute-1 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Dec 06 09:56:24 compute-1 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Dec 06 09:56:24 compute-1 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Dec 06 09:56:24 compute-1 systemd[1]: Starting libvirt nodedev daemon...
Dec 06 09:56:24 compute-1 systemd[1]: Started libvirt nodedev daemon.
Dec 06 09:56:24 compute-1 sudo[191780]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:24 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 09:56:24 compute-1 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Dec 06 09:56:24 compute-1 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Dec 06 09:56:24 compute-1 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Dec 06 09:56:24 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:24 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e8004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:24 compute-1 sudo[192004]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wbhzighhwuzdufulibugzkknsznaegwl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014984.4765055-3144-173928717652368/AnsiballZ_systemd.py'
Dec 06 09:56:24 compute-1 sudo[192004]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:25 compute-1 python3.9[192006]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 06 09:56:25 compute-1 systemd[1]: Reloading.
Dec 06 09:56:25 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:56:25 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:56:25 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:25.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:56:25 compute-1 systemd-sysv-generator[192039]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update the package to include a native systemd unit file, in order to make it safer and more robust.
Dec 06 09:56:25 compute-1 systemd-rc-local-generator[192035]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:56:25 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:56:25 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:56:25 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:25.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:56:25 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:25 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0001fc0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:25 compute-1 ceph-mon[79770]: pgmap v447: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.4 KiB/s rd, 426 B/s wr, 1 op/s
Dec 06 09:56:25 compute-1 systemd[1]: Starting libvirt proxy daemon admin socket...
Dec 06 09:56:25 compute-1 systemd[1]: Starting libvirt proxy daemon read-only socket...
Dec 06 09:56:25 compute-1 systemd[1]: Listening on libvirt proxy daemon admin socket.
Dec 06 09:56:25 compute-1 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Dec 06 09:56:25 compute-1 systemd[1]: Starting libvirt proxy daemon...
Dec 06 09:56:25 compute-1 setroubleshoot[191818]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 0aa677db-8f04-42d4-9355-b01c7fb3c0b5
Dec 06 09:56:25 compute-1 systemd[1]: Started libvirt proxy daemon.
Dec 06 09:56:25 compute-1 setroubleshoot[191818]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                  
                                                  *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                  
                                                  If you want to help identify whether the domain needs this access, or whether a file on your system has the wrong permissions,
                                                  then turn on full auditing to get path information about the offending file and generate the error again.
                                                  Do:
                                                  
                                                  Turn on full auditing:
                                                  # auditctl -w /etc/shadow -p w
                                                  Try to recreate the AVC, then execute:
                                                  # ausearch -m avc -ts recent
                                                  If you see a PATH record, check the ownership and permissions on the file and fix it;
                                                  otherwise report it as a Bugzilla bug.
                                                  
                                                  *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                  
                                                  If you believe that virtlogd should have the dac_read_search capability by default,
                                                  then you should report this as a bug.
                                                  You can generate a local policy module to allow this access.
                                                  Do:
                                                  Allow this access for now by executing:
                                                  # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                  # semodule -X 300 -i my-virtlogd.pp
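Before installing the module suggested above, its type-enforcement source can be previewed; a minimal sketch, assuming the same denial and the stock virtlogd_t domain (lowercase -m prints the generated source instead of packaging a .pp):

    ausearch -c 'virtlogd' --raw | audit2allow -m my-virtlogd
    # Illustrative shape of the output for this AVC:
    #   module my-virtlogd 1.0;
    #   require { type virtlogd_t; class capability dac_read_search; }
    #   allow virtlogd_t self:capability dac_read_search;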
                                                  
Dec 06 09:56:25 compute-1 sudo[192004]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:25 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:25 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2180012b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:26 compute-1 sudo[192218]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tmwbrqggmcdzlyfnpdpykarxhoczbtvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014985.94766-3144-120287074289526/AnsiballZ_systemd.py'
Dec 06 09:56:26 compute-1 sudo[192218]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:26 compute-1 python3.9[192220]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 06 09:56:26 compute-1 systemd[1]: Reloading.
Dec 06 09:56:26 compute-1 systemd-rc-local-generator[192248]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:56:26 compute-1 systemd-sysv-generator[192252]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update the package to include a native systemd unit file, in order to make it safer and more robust.
Dec 06 09:56:26 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:26 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:26 compute-1 systemd[1]: Listening on libvirt locking daemon socket.
Dec 06 09:56:26 compute-1 systemd[1]: Starting libvirt QEMU daemon socket...
Dec 06 09:56:26 compute-1 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Dec 06 09:56:26 compute-1 systemd[1]: Starting Virtual Machine and Container Registration Service...
Dec 06 09:56:26 compute-1 systemd[1]: Listening on libvirt QEMU daemon socket.
Dec 06 09:56:26 compute-1 systemd[1]: Starting libvirt QEMU daemon admin socket...
Dec 06 09:56:26 compute-1 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Dec 06 09:56:26 compute-1 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Dec 06 09:56:26 compute-1 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Dec 06 09:56:26 compute-1 systemd[1]: Started Virtual Machine and Container Registration Service.
Dec 06 09:56:26 compute-1 systemd[1]: Starting libvirt QEMU daemon...
Dec 06 09:56:26 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:26 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 09:56:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:27 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 09:56:27 compute-1 systemd[1]: Started libvirt QEMU daemon.
Dec 06 09:56:27 compute-1 sudo[192218]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:27 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:56:27 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 09:56:27 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:27.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 09:56:27 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:56:27 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:56:27 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:27.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:56:27 compute-1 sudo[192445]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-popjgmjusilzunijiixnystxubovecto ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014987.2026188-3144-224998940023754/AnsiballZ_systemd.py'
Dec 06 09:56:27 compute-1 sudo[192445]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:27 compute-1 podman[192406]: 2025-12-06 09:56:27.555588207 +0000 UTC m=+0.078642915 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 06 09:56:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:27 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e8004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:27 compute-1 ceph-mon[79770]: pgmap v448: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 426 B/s wr, 1 op/s
Dec 06 09:56:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:27 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e8004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:27 compute-1 python3.9[192453]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 06 09:56:27 compute-1 systemd[1]: Reloading.
Dec 06 09:56:27 compute-1 systemd-rc-local-generator[192481]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:56:27 compute-1 systemd-sysv-generator[192485]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update the package to include a native systemd unit file, in order to make it safer and more robust.
Dec 06 09:56:28 compute-1 systemd[1]: Starting libvirt secret daemon socket...
Dec 06 09:56:28 compute-1 systemd[1]: Listening on libvirt secret daemon socket.
Dec 06 09:56:28 compute-1 systemd[1]: Starting libvirt secret daemon admin socket...
Dec 06 09:56:28 compute-1 systemd[1]: Starting libvirt secret daemon read-only socket...
Dec 06 09:56:28 compute-1 systemd[1]: Listening on libvirt secret daemon read-only socket.
Dec 06 09:56:28 compute-1 systemd[1]: Listening on libvirt secret daemon admin socket.
Dec 06 09:56:28 compute-1 systemd[1]: Starting libvirt secret daemon...
Dec 06 09:56:28 compute-1 systemd[1]: Started libvirt secret daemon.
Dec 06 09:56:28 compute-1 sudo[192445]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:28 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:56:28 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:28 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2180012b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:28 compute-1 sudo[192539]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 09:56:28 compute-1 sudo[192539]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:56:28 compute-1 sudo[192539]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:29 compute-1 sudo[192689]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tdqnlsfwiuoigbqbhlqrexfwstrzfqux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014988.791929-3255-196874302036964/AnsiballZ_file.py'
Dec 06 09:56:29 compute-1 sudo[192689]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:29 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:56:29 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:56:29 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:29.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:56:29 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:56:29 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:56:29 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:29.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:56:29 compute-1 python3.9[192691]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:56:29 compute-1 sudo[192689]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:29 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:29 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:29 compute-1 ceph-mon[79770]: pgmap v449: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.5 KiB/s rd, 767 B/s wr, 3 op/s
Dec 06 09:56:29 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:29 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e8004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:29 compute-1 sudo[192841]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-esmecgdpfvbwfsbwroeulxtnmixnaqgw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014989.5840912-3279-183155141712156/AnsiballZ_find.py'
Dec 06 09:56:29 compute-1 sudo[192841]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:30 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:30 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec 06 09:56:30 compute-1 python3.9[192843]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 06 09:56:30 compute-1 sudo[192841]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:30 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:30 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0002a80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:30 compute-1 ceph-mon[79770]: pgmap v450: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.7 KiB/s rd, 341 B/s wr, 2 op/s
Dec 06 09:56:30 compute-1 sudo[192994]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nepnhlxtvsqtyphiowctvhyjanvjlkfb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014990.5462313-3304-97076395208505/AnsiballZ_command.py'
Dec 06 09:56:30 compute-1 sudo[192994]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:31 compute-1 python3.9[192996]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail;
                                             echo ceph
                                             awk -F '=' '/fsid/ {print $2}' /var/lib/openstack/config/ceph/ceph.conf | xargs
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
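The command task above prints the cluster name and then scrapes the fsid out of ceph.conf. A standalone equivalent, assuming the usual key = value layout under [global]:

    set -o pipefail
    echo ceph
    awk -F '=' '/fsid/ {print $2}' /var/lib/openstack/config/ceph/ceph.conf | xargs
    # -F '=' makes $2 everything after 'fsid ='; xargs trims the surrounding whitespace,
    # yielding e.g. 5ecd3f74-dade-5fc4-92ce-8950ae424258, the fsid seen throughout this log.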
Dec 06 09:56:31 compute-1 sudo[192994]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:31 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:56:31 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:56:31 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:31.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:56:31 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:56:31 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:56:31 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:31.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:56:31 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:31 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2180012b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:31 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:31 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:31 compute-1 python3.9[193150]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.keyring'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 06 09:56:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:32 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e8004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:32 compute-1 python3.9[193301]: ansible-ansible.legacy.stat Invoked with path=/tmp/secret.xml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:56:33 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:33 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 09:56:33 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:56:33 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:56:33 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:33.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:56:33 compute-1 ceph-mon[79770]: pgmap v451: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.7 KiB/s rd, 341 B/s wr, 2 op/s
Dec 06 09:56:33 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:56:33 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:56:33 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:33.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:56:33 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:56:33 compute-1 python3.9[193422]: ansible-ansible.legacy.copy Invoked with dest=/tmp/secret.xml mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1765014992.3877494-3360-250846414704821/.source.xml follow=False _original_basename=secret.xml.j2 checksum=f7c948a7651e1e704e9fb6c67bea136c2b7876ec backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:56:33 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:33 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0002a80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:33 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:33 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2180012b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:34 compute-1 sudo[193572]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eewnnmgsbruvjlwfsqyemoaminsykjlw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014993.7235498-3405-101999480039584/AnsiballZ_command.py'
Dec 06 09:56:34 compute-1 sudo[193572]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:34 compute-1 python3.9[193574]: ansible-ansible.legacy.command Invoked with _raw_params=virsh secret-undefine 5ecd3f74-dade-5fc4-92ce-8950ae424258
                                             virsh secret-define --file /tmp/secret.xml
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
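The two virsh calls above replace the libvirt secret keyed by the Ceph fsid; /tmp/secret.xml is templated just before this (09:56:33) and removed just after (09:56:35), so its exact contents are not in the log. A sketch of a typical ceph-usage secret, plus the set-value step implied by the KEY variable at 09:56:36; the usage name here is an assumption:

    cat > /tmp/secret.xml <<'EOF'
    <secret ephemeral='no' private='no'>
      <uuid>5ecd3f74-dade-5fc4-92ce-8950ae424258</uuid>
      <usage type='ceph'>
        <name>client.openstack secret</name>   <!-- assumed usage name -->
      </usage>
    </secret>
    EOF
    virsh secret-define --file /tmp/secret.xml
    virsh secret-set-value --secret 5ecd3f74-dade-5fc4-92ce-8950ae424258 --base64 "$KEY"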
Dec 06 09:56:34 compute-1 polkitd[43440]: Registered Authentication Agent for unix-process:193577:336695 (system bus name :1.1829 [pkttyagent --process 193577 --notify-fd 4 --fallback], object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8)
Dec 06 09:56:34 compute-1 polkitd[43440]: Unregistered Authentication Agent for unix-process:193577:336695 (system bus name :1.1829, object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8) (disconnected from bus)
Dec 06 09:56:34 compute-1 polkitd[43440]: Registered Authentication Agent for unix-process:193576:336694 (system bus name :1.1830 [pkttyagent --process 193576 --notify-fd 4 --fallback], object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8)
Dec 06 09:56:34 compute-1 polkitd[43440]: Unregistered Authentication Agent for unix-process:193576:336694 (system bus name :1.1830, object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8) (disconnected from bus)
Dec 06 09:56:34 compute-1 sudo[193572]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:34 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:35 compute-1 python3.9[193737]: ansible-ansible.builtin.file Invoked with path=/tmp/secret.xml state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:56:35 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:56:35 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:56:35 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:35.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:56:35 compute-1 ceph-mon[79770]: pgmap v452: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.0 KiB/s rd, 597 B/s wr, 2 op/s
Dec 06 09:56:35 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:56:35 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:56:35 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:35.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:56:35 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:35 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e8004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:35 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/095635 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec 06 09:56:35 compute-1 sudo[193887]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-chwxjigxmwvtvnwtkfmfiaqfmxtvpahw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014995.385037-3453-164480246742781/AnsiballZ_command.py'
Dec 06 09:56:35 compute-1 sudo[193887]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:35 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:35 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0002a80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:35 compute-1 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Dec 06 09:56:35 compute-1 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Consumed 1.044s CPU time.
Dec 06 09:56:35 compute-1 systemd[1]: setroubleshootd.service: Deactivated successfully.
Dec 06 09:56:35 compute-1 sudo[193887]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:36 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 09:56:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:36 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 09:56:36 compute-1 sudo[194041]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-enyveodenzlycvyqflupdqksjaufopgc ; FSID=5ecd3f74-dade-5fc4-92ce-8950ae424258 KEY=AQA7+TNpAAAAABAABZDZy1tS5Qay3mTps8dAWg== /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014996.2334592-3477-255165610609091/AnsiballZ_command.py'
Dec 06 09:56:36 compute-1 sudo[194041]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:36 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0002a80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:36 compute-1 polkitd[43440]: Registered Authentication Agent for unix-process:194044:336942 (system bus name :1.1833 [pkttyagent --process 194044 --notify-fd 4 --fallback], object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8)
Dec 06 09:56:36 compute-1 polkitd[43440]: Unregistered Authentication Agent for unix-process:194044:336942 (system bus name :1.1833, object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8) (disconnected from bus)
Dec 06 09:56:36 compute-1 sudo[194041]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:37 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:56:37 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:56:37 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:37.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:56:37 compute-1 ceph-mon[79770]: pgmap v453: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 597 B/s wr, 2 op/s
Dec 06 09:56:37 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:56:37 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:56:37 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:37.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:56:37 compute-1 sudo[194199]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ieluqcvsgkjwfnldkbmtepuqltbjhxaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014997.0979567-3501-107573468765763/AnsiballZ_copy.py'
Dec 06 09:56:37 compute-1 sudo[194199]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:37 compute-1 python3.9[194201]: ansible-ansible.legacy.copy Invoked with dest=/etc/ceph/ceph.conf group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/config/ceph/ceph.conf backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:56:37 compute-1 sudo[194199]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:37 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:37 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e8004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:38 compute-1 sudo[194351]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfnsfhahakejezfhihqatjpvxvlbgnep ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014997.8608117-3525-221189080822303/AnsiballZ_stat.py'
Dec 06 09:56:38 compute-1 sudo[194351]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:38 compute-1 python3.9[194353]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:56:38 compute-1 sudo[194351]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:38 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:56:38 compute-1 sudo[194475]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-poqvzdckdljcddjdqzclhroamhnytrzt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014997.8608117-3525-221189080822303/AnsiballZ_copy.py'
Dec 06 09:56:38 compute-1 sudo[194475]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:38 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:38 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0002a80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:38 compute-1 python3.9[194477]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1765014997.8608117-3525-221189080822303/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:56:38 compute-1 sudo[194475]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:39 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:39 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec 06 09:56:39 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:56:39 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:56:39 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:39.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:56:39 compute-1 ceph-mon[79770]: pgmap v454: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 3.5 KiB/s rd, 1.3 KiB/s wr, 5 op/s
Dec 06 09:56:39 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 09:56:39 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:56:39 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:56:39 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:39.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:56:39 compute-1 sudo[194627]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zuqlepujwlpeacdakgoyxnocezxrrtpk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765014999.2982597-3573-45709712481237/AnsiballZ_file.py'
Dec 06 09:56:39 compute-1 sudo[194627]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:39 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:39 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0002a80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:39 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:39 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:39 compute-1 python3.9[194629]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:56:39 compute-1 sudo[194627]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:40 compute-1 sudo[194780]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-adssnhranlhgikjjwinqnpiervidnfhz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015000.2235136-3597-86121914799305/AnsiballZ_stat.py'
Dec 06 09:56:40 compute-1 sudo[194780]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:40 compute-1 python3.9[194782]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:56:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:40 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e8004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:40 compute-1 sudo[194780]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:41 compute-1 sudo[194858]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jefuacqckcphknohoonsbxyvtlurmlsq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015000.2235136-3597-86121914799305/AnsiballZ_file.py'
Dec 06 09:56:41 compute-1 sudo[194858]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:41 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:56:41 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:56:41 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:41.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:56:41 compute-1 python3.9[194860]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:56:41 compute-1 sudo[194858]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:41 compute-1 ceph-mon[79770]: pgmap v455: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.3 KiB/s rd, 1023 B/s wr, 3 op/s
Dec 06 09:56:41 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:56:41 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:56:41 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:41.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:56:41 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:41 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0002a80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:41 compute-1 sudo[195010]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xucmlsodgvnidtqgvoakmwimfehdgabd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015001.4637804-3633-76640544745393/AnsiballZ_stat.py'
Dec 06 09:56:41 compute-1 sudo[195010]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:41 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:41 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2180012b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:41 compute-1 python3.9[195012]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:56:42 compute-1 sudo[195010]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:42 compute-1 sudo[195089]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-umliygnkdxfsfbdgfgrkearyxmurxijt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015001.4637804-3633-76640544745393/AnsiballZ_file.py'
Dec 06 09:56:42 compute-1 sudo[195089]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:42 compute-1 python3.9[195091]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.hnm4iwsp recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:56:42 compute-1 sudo[195089]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:42 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/095642 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec 06 09:56:43 compute-1 sudo[195241]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-otkntkqnalbjjssrlxzrevzlocwiraxd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015002.7743342-3669-88557329680458/AnsiballZ_stat.py'
Dec 06 09:56:43 compute-1 sudo[195241]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:43 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:56:43 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:56:43 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:43.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:56:43 compute-1 python3.9[195243]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:56:43 compute-1 sudo[195241]: pam_unix(sudo:session): session closed for user root
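The stat above targets /etc/nftables/iptables.nft, where the EDPM firewall role renders its ruleset from the YAML staged under /var/lib/edpm-config/firewall. As a sketch, the file can be syntax-checked without loading it, and the live ruleset inspected; the inspection step is illustrative:

    nft --check -f /etc/nftables/iptables.nft && echo 'ruleset parses'
    nft list ruleset | head -n 20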
Dec 06 09:56:43 compute-1 ceph-mon[79770]: pgmap v456: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.3 KiB/s rd, 1023 B/s wr, 3 op/s
Dec 06 09:56:43 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:56:43 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:56:43 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:43.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:56:43 compute-1 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #31. Immutable memtables: 0.
Dec 06 09:56:43 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:56:43.372964) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 06 09:56:43 compute-1 ceph-mon[79770]: rocksdb: [db/flush_job.cc:856] [default] [JOB 15] Flushing memtable with next log file: 31
Dec 06 09:56:43 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015003373131, "job": 15, "event": "flush_started", "num_memtables": 1, "num_entries": 542, "num_deletes": 251, "total_data_size": 892969, "memory_usage": 903152, "flush_reason": "Manual Compaction"}
Dec 06 09:56:43 compute-1 ceph-mon[79770]: rocksdb: [db/flush_job.cc:885] [default] [JOB 15] Level-0 flush table #32: started
Dec 06 09:56:43 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015003380179, "cf_name": "default", "job": 15, "event": "table_file_creation", "file_number": 32, "file_size": 589687, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 17606, "largest_seqno": 18143, "table_properties": {"data_size": 586862, "index_size": 861, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 901, "raw_key_size": 6542, "raw_average_key_size": 18, "raw_value_size": 581332, "raw_average_value_size": 1665, "num_data_blocks": 39, "num_entries": 349, "num_filter_entries": 349, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014971, "oldest_key_time": 1765014971, "file_creation_time": 1765015003, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 32, "seqno_to_time_mapping": "N/A"}}
Dec 06 09:56:43 compute-1 ceph-mon[79770]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 15] Flush lasted 7231 microseconds, and 3707 cpu microseconds.
Dec 06 09:56:43 compute-1 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 09:56:43 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:56:43.380225) [db/flush_job.cc:967] [default] [JOB 15] Level-0 flush table #32: 589687 bytes OK
Dec 06 09:56:43 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:56:43.380242) [db/memtable_list.cc:519] [default] Level-0 commit table #32 started
Dec 06 09:56:43 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:56:43.381553) [db/memtable_list.cc:722] [default] Level-0 commit table #32: memtable #1 done
Dec 06 09:56:43 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:56:43.381566) EVENT_LOG_v1 {"time_micros": 1765015003381562, "job": 15, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 06 09:56:43 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:56:43.381586) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 06 09:56:43 compute-1 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 15] Try to delete WAL files size 889836, prev total WAL file size 889836, number of live WAL files 2.
Dec 06 09:56:43 compute-1 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000028.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 09:56:43 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:56:43.382222) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031303034' seq:72057594037927935, type:22 .. '7061786F730031323536' seq:0, type:0; will stop at (end)
Dec 06 09:56:43 compute-1 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 16] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 06 09:56:43 compute-1 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 15 Base level 0, inputs: [32(575KB)], [30(13MB)]
Dec 06 09:56:43 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015003382346, "job": 16, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [32], "files_L6": [30], "score": -1, "input_data_size": 14526008, "oldest_snapshot_seqno": -1}
Dec 06 09:56:43 compute-1 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 16] Generated table #33: 4870 keys, 12334259 bytes, temperature: kUnknown
Dec 06 09:56:43 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015003472814, "cf_name": "default", "job": 16, "event": "table_file_creation", "file_number": 33, "file_size": 12334259, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12300996, "index_size": 19969, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12229, "raw_key_size": 123201, "raw_average_key_size": 25, "raw_value_size": 12211804, "raw_average_value_size": 2507, "num_data_blocks": 830, "num_entries": 4870, "num_filter_entries": 4870, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014012, "oldest_key_time": 0, "file_creation_time": 1765015003, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 33, "seqno_to_time_mapping": "N/A"}}
Dec 06 09:56:43 compute-1 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 09:56:43 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:56:43.473181) [db/compaction/compaction_job.cc:1663] [default] [JOB 16] Compacted 1@0 + 1@6 files to L6 => 12334259 bytes
Dec 06 09:56:43 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:56:43.474717) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 160.4 rd, 136.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.6, 13.3 +0.0 blob) out(11.8 +0.0 blob), read-write-amplify(45.6) write-amplify(20.9) OK, records in: 5380, records dropped: 510 output_compression: NoCompression
Dec 06 09:56:43 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:56:43.474738) EVENT_LOG_v1 {"time_micros": 1765015003474728, "job": 16, "event": "compaction_finished", "compaction_time_micros": 90567, "compaction_time_cpu_micros": 27597, "output_level": 6, "num_output_files": 1, "total_output_size": 12334259, "num_input_records": 5380, "num_output_records": 4870, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 06 09:56:43 compute-1 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000032.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 09:56:43 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015003474967, "job": 16, "event": "table_file_deletion", "file_number": 32}
Dec 06 09:56:43 compute-1 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000030.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 09:56:43 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015003477527, "job": 16, "event": "table_file_deletion", "file_number": 30}
Dec 06 09:56:43 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:56:43.382083) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 09:56:43 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:56:43.477658) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 09:56:43 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:56:43.477668) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 09:56:43 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:56:43.477671) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 09:56:43 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:56:43.477674) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 09:56:43 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:56:43.477677) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
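
[annotation] The JOB 16 summary at 09:56:43.474717 is internally consistent: 5380 input records minus 510 dropped records matches table #33's 4870 entries, and the amplification figures follow from the byte counts logged in the surrounding EVENT_LOG entries, using the usual definitions (bytes written divided by bytes entering from L0, and total bytes moved divided by bytes entering from L0). A minimal check with the values taken verbatim from the log:

    # Reproduce write-amplify(20.9) and read-write-amplify(45.6) from JOB 16.
    l0_input = 589_687         # table #32 flushed from the memtable (file_size)
    total_input = 14_526_008   # "input_data_size" in the compaction_started event
    output = 12_334_259        # table #33 "total_output_size"

    write_amp = output / l0_input                       # -> 20.9
    read_write_amp = (total_input + output) / l0_input  # -> 45.6
    print(f"{write_amp:.1f} {read_write_amp:.1f}")
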
Dec 06 09:56:43 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:56:43 compute-1 sudo[195319]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-krgnovdyycknvtlyafpuwrejqbuhtayd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015002.7743342-3669-88557329680458/AnsiballZ_file.py'
Dec 06 09:56:43 compute-1 sudo[195319]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:43 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:43 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:43 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:43 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0002a80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:43 compute-1 python3.9[195321]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:56:43 compute-1 sudo[195319]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:44 compute-1 sudo[195472]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ovpyukoncmfjeudllgscdwjfcviuhbok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015004.1574938-3708-46982551817620/AnsiballZ_command.py'
Dec 06 09:56:44 compute-1 sudo[195472]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:44 compute-1 python3.9[195474]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:56:44 compute-1 sudo[195472]: pam_unix(sudo:session): session closed for user root
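
[annotation] The command task at 09:56:44 snapshots the live ruleset as JSON (`nft -j list ruleset`) before any EDPM rule files are applied, giving the role a machine-readable baseline. A minimal sketch of consuming that output, assuming only nft's documented JSON envelope (a top-level "nftables" array of one-key objects such as {"table": ...}, {"chain": ...}, {"rule": ...}):

    import json
    import subprocess

    # Capture the ruleset the same way the task above does, then list the chains.
    out = subprocess.run(["nft", "-j", "list", "ruleset"],
                         capture_output=True, text=True, check=True).stdout
    objects = json.loads(out)["nftables"]
    chains = [o["chain"]["name"] for o in objects if "chain" in o]
    print("chains currently defined:", chains)
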
Dec 06 09:56:44 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:44 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0002a80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:45 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:56:45 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:56:45 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:45.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:56:45 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:56:45 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:56:45 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:45.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:56:45 compute-1 sudo[195625]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qclsembwlyxgkazayiskxxmjhrwgrils ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765015004.9475725-3733-58269936364551/AnsiballZ_edpm_nftables_from_files.py'
Dec 06 09:56:45 compute-1 sudo[195625]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:45 compute-1 ceph-mon[79770]: pgmap v457: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.4 KiB/s rd, 1023 B/s wr, 3 op/s
Dec 06 09:56:45 compute-1 python3[195627]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec 06 09:56:45 compute-1 sudo[195625]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:45 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:45 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e8004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:45 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:45 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0002a80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:46 compute-1 sudo[195777]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nuytwdqwnzafikrbjibaibtzcfheefpx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015005.8318744-3756-229159711011374/AnsiballZ_stat.py'
Dec 06 09:56:46 compute-1 sudo[195777]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:46 compute-1 python3.9[195779]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:56:46 compute-1 sudo[195777]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:46 compute-1 sudo[195856]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oxfyawsoxygveqtyaujtnjcsqvnvnywg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015005.8318744-3756-229159711011374/AnsiballZ_file.py'
Dec 06 09:56:46 compute-1 sudo[195856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:46 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:46 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e8004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:46 compute-1 python3.9[195858]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:56:46 compute-1 sudo[195856]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:47 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:56:47 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:56:47 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:47.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:56:47 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:56:47 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:56:47 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:47.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:56:47 compute-1 ceph-mon[79770]: pgmap v458: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.1 KiB/s rd, 767 B/s wr, 3 op/s
Dec 06 09:56:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:47 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:47 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0002a80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:48 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:56:48 compute-1 sudo[196010]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dmwotbvwoajgcdipwwmtgkpdnvtiocqn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015008.182302-3793-227413088339672/AnsiballZ_stat.py'
Dec 06 09:56:48 compute-1 sudo[196010]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:48 compute-1 python3.9[196012]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:56:48 compute-1 sudo[196010]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:48 compute-1 auditd[703]: Audit daemon rotating log files
Dec 06 09:56:48 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:48 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0002a80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:48 compute-1 sudo[196040]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 09:56:48 compute-1 sudo[196040]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:56:48 compute-1 sudo[196040]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:48 compute-1 sudo[196115]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tmuyfwymoqegidchudakuommncyrymny ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015008.182302-3793-227413088339672/AnsiballZ_file.py'
Dec 06 09:56:48 compute-1 sudo[196115]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:49 compute-1 python3.9[196117]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:56:49 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:56:49 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:56:49 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:49.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:56:49 compute-1 sudo[196115]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:49 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:56:49 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:56:49 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:49.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:56:49 compute-1 ceph-mon[79770]: pgmap v459: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.1 KiB/s rd, 767 B/s wr, 3 op/s
Dec 06 09:56:49 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:49 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0002a80 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:49 compute-1 sudo[196267]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nzfnnsufwzeugcwxuxpfawinmsrrkkjd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015009.3964386-3828-164896133680914/AnsiballZ_stat.py'
Dec 06 09:56:49 compute-1 sudo[196267]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:49 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:49 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c002780 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:49 compute-1 python3.9[196269]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:56:49 compute-1 sudo[196267]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:50 compute-1 sudo[196345]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jziztyiialwgrqzrmnuggzpnrwkvwqbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015009.3964386-3828-164896133680914/AnsiballZ_file.py'
Dec 06 09:56:50 compute-1 sudo[196345]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:50 compute-1 python3.9[196347]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:56:50 compute-1 sudo[196345]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:50 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:50 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140025c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:50 compute-1 sudo[196498]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cfpcwnglencvzlmbzsdukqnraxkgpxfq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015010.6688662-3864-269704492396953/AnsiballZ_stat.py'
Dec 06 09:56:50 compute-1 sudo[196498]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:51 compute-1 python3.9[196500]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:56:51 compute-1 sudo[196498]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:51 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:56:51 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:56:51 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:51.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:56:51 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:56:51 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 09:56:51 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:51.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 09:56:51 compute-1 ceph-mon[79770]: pgmap v460: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 85 B/s rd, 0 B/s wr, 0 op/s
Dec 06 09:56:51 compute-1 sudo[196576]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nakdjkpgspmubowurlaswyutrqnywklg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015010.6688662-3864-269704492396953/AnsiballZ_file.py'
Dec 06 09:56:51 compute-1 sudo[196576]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:51 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:51 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140025c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:51 compute-1 python3.9[196578]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:56:51 compute-1 sudo[196576]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:51 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:51 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00042e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:52 compute-1 sudo[196752]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fjckzsyryseqdhcycsmpevgganqzdtjr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015011.925789-3900-41308981290846/AnsiballZ_stat.py'
Dec 06 09:56:52 compute-1 sudo[196707]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:56:52 compute-1 sudo[196707]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:56:52 compute-1 sudo[196752]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:52 compute-1 sudo[196707]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:52 compute-1 sudo[196757]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 06 09:56:52 compute-1 sudo[196757]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:56:52 compute-1 python3.9[196756]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:56:52 compute-1 sudo[196752]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:52 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c002780 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:52 compute-1 podman[196848]: 2025-12-06 09:56:52.800002989 +0000 UTC m=+0.097181359 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 09:56:52 compute-1 sudo[196757]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:52 compute-1 sudo[196962]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vleclzvodksrydotvywtwxlyzkpxhbgq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015011.925789-3900-41308981290846/AnsiballZ_copy.py'
Dec 06 09:56:52 compute-1 sudo[196962]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:53 compute-1 python3.9[196964]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765015011.925789-3900-41308981290846/.source.nft follow=False _original_basename=ruleset.j2 checksum=ac3ce8ce2d33fa5fe0a79b0c811c97734ce43fa5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:56:53 compute-1 sudo[196962]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:53 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:56:53 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:56:53 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:53.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:56:53 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:56:53 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:56:53 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:53.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:56:53 compute-1 ceph-mon[79770]: pgmap v461: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 85 B/s rd, 0 B/s wr, 0 op/s
Dec 06 09:56:53 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 09:56:53 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 06 09:56:53 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:56:53 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:56:53 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 06 09:56:53 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 06 09:56:53 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 09:56:53 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:56:53 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:53 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e8004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:53 compute-1 sudo[197114]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ojshjvsimckwpfhbxcabbnbwrljfknrn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015013.5109873-3945-37322140457171/AnsiballZ_file.py'
Dec 06 09:56:53 compute-1 sudo[197114]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:53 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:53 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140025c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:54 compute-1 python3.9[197116]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:56:54 compute-1 sudo[197114]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:56:54.266 141446 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:56:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:56:54.268 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:56:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:56:54.268 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:56:54 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 09:56:54 compute-1 sudo[197267]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-avmtshuswqswbnlduaxamutvjyfcwjow ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015014.2409337-3969-161354100221157/AnsiballZ_command.py'
Dec 06 09:56:54 compute-1 sudo[197267]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:54 compute-1 python3.9[197269]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:56:54 compute-1 sudo[197267]: pam_unix(sudo:session): session closed for user root
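
[annotation] The check at 09:56:54 is a dry run: it concatenates the five EDPM files in their load order (chains, flushes, rules, update-jumps, jumps) and pipes the result through `nft -c -f -`, which parses and validates the combined ruleset without committing anything to the kernel. Only after this succeeds do the later tasks load the chains (09:56:56) and apply the rule files (09:56:58). A minimal Python equivalent of the same pipeline:

    import subprocess
    from pathlib import Path

    # Concatenate in load order and let nft validate in check-only mode (-c),
    # reading the ruleset from stdin (-f -), exactly as the shell task does.
    ORDER = ["edpm-chains.nft", "edpm-flushes.nft", "edpm-rules.nft",
             "edpm-update-jumps.nft", "edpm-jumps.nft"]
    ruleset = "\n".join(Path("/etc/nftables", f).read_text() for f in ORDER)
    subprocess.run(["nft", "-c", "-f", "-"], input=ruleset, text=True, check=True)
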
Dec 06 09:56:54 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:54 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00042e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:55 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:56:55 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:56:55 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:55.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:56:55 compute-1 sudo[197422]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jairkeouhxjcapsifxrhwciedelashnd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015014.9265823-3993-213757051730030/AnsiballZ_blockinfile.py'
Dec 06 09:56:55 compute-1 sudo[197422]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:55 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:56:55 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:56:55 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:55.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:56:55 compute-1 python3.9[197424]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:56:55 compute-1 ceph-mon[79770]: pgmap v462: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 B/s wr, 0 op/s
Dec 06 09:56:55 compute-1 sudo[197422]: pam_unix(sudo:session): session closed for user root
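
[annotation] The blockinfile task at 09:56:55 persists the configuration across reboots: it inserts a marker-delimited block of include statements into /etc/sysconfig/nftables.conf (validated with `nft -c -f %s` before the file is swapped in), which the nftables service reads at boot. Given marker=# {mark} ANSIBLE MANAGED BLOCK, the managed section should come out as:

    # BEGIN ANSIBLE MANAGED BLOCK
    include "/etc/nftables/iptables.nft"
    include "/etc/nftables/edpm-chains.nft"
    include "/etc/nftables/edpm-rules.nft"
    include "/etc/nftables/edpm-jumps.nft"
    # END ANSIBLE MANAGED BLOCK

Note that edpm-flushes.nft and edpm-update-jumps.nft are absent here; judging from this run they serve only the live reload at 09:56:58 below, not a clean boot-time load.
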
Dec 06 09:56:55 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:55 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c003160 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:55 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:55 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e8004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:56 compute-1 sudo[197574]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iprhtzuvdsfiiwhxcahdndceepuwxztn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015015.9097767-4020-170647237076625/AnsiballZ_command.py'
Dec 06 09:56:56 compute-1 sudo[197574]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:56 compute-1 python3.9[197576]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:56:56 compute-1 sudo[197574]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:56 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:56 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140037d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:57 compute-1 sudo[197728]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tipbqzdlpiffmbgvptohelbiimclntnx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015016.726361-4044-156205028623502/AnsiballZ_stat.py'
Dec 06 09:56:57 compute-1 sudo[197728]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:57 compute-1 python3.9[197730]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:56:57 compute-1 sudo[197728]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:57 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:56:57 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:56:57 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:57.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:56:57 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:56:57 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:56:57 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:57.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:56:57 compute-1 ceph-mon[79770]: pgmap v463: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:56:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:57 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00042e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:57 compute-1 podman[197832]: 2025-12-06 09:56:57.758691153 +0000 UTC m=+0.058491972 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Dec 06 09:56:57 compute-1 sudo[197901]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iyewmwgrsymromjauxphsdrgoggzqnzp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015017.5477676-4069-96020993297347/AnsiballZ_command.py'
Dec 06 09:56:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:57 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c003160 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:57 compute-1 sudo[197901]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:58 compute-1 python3.9[197903]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:56:58 compute-1 sudo[197901]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:58 compute-1 sudo[197932]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:56:58 compute-1 sudo[197932]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:56:58 compute-1 sudo[197932]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:58 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:56:58 compute-1 sudo[198082]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oeoavxjsvcfqxgnnzlxqovpkivsibqjf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015018.3022127-4092-85452108237816/AnsiballZ_file.py'
Dec 06 09:56:58 compute-1 sudo[198082]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:58 compute-1 python3.9[198084]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:56:58 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:58 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e8004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:58 compute-1 sudo[198082]: pam_unix(sudo:session): session closed for user root
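
[annotation] Lines 09:56:53 through 09:56:58 show a marker-file handshake that keeps the reload idempotent: the copy of ruleset.j2 is followed by touching /etc/nftables/edpm-rules.nft.changed (09:56:54), a stat later checks for that marker (09:56:57), the flushes + rules + update-jumps pipeline is applied via `nft -f -` because the marker exists (09:56:58), and the marker is then removed (state=absent, above). On a run where the rendered rules are unchanged, no marker is created and the live reload is skipped. A compact sketch of the same logic, under the assumption that this is the role's intent:

    import subprocess
    from pathlib import Path

    MARKER = Path("/etc/nftables/edpm-rules.nft.changed")

    # Reload the EDPM ruleset only when an earlier task flagged a change,
    # then clear the flag so the next unchanged run is a no-op.
    if MARKER.exists():
        files = ["edpm-flushes.nft", "edpm-rules.nft", "edpm-update-jumps.nft"]
        ruleset = "\n".join(Path("/etc/nftables", f).read_text() for f in files)
        subprocess.run(["nft", "-f", "-"], input=ruleset, text=True, check=True)
        MARKER.unlink()
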
Dec 06 09:56:59 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:56:59 compute-1 ceph-mon[79770]: pgmap v464: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:56:59 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:56:59 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:56:59 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:56:59 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:56:59.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:56:59 compute-1 sudo[198234]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vpkdpbpzbcdyvffrphevfvoexkcpptvr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015019.058871-4116-133551360775595/AnsiballZ_stat.py'
Dec 06 09:56:59 compute-1 sudo[198234]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:56:59 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:56:59 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:56:59 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:56:59.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:56:59 compute-1 python3.9[198236]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:56:59 compute-1 sudo[198234]: pam_unix(sudo:session): session closed for user root
Dec 06 09:56:59 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:59 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140037d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:59 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:56:59 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00042e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:56:59 compute-1 sudo[198357]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yazizxikqrqpgcdwyjbowvmsspmtlxrv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015019.058871-4116-133551360775595/AnsiballZ_copy.py'
Dec 06 09:56:59 compute-1 sudo[198357]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:00 compute-1 python3.9[198359]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765015019.058871-4116-133551360775595/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:57:00 compute-1 sudo[198357]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:00 compute-1 sudo[198510]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-auehpdwmxglfwoolqkqjrwgbwitlllim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015020.4203186-4162-158177263821505/AnsiballZ_stat.py'
Dec 06 09:57:00 compute-1 sudo[198510]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:00 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:00 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c003e70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:00 compute-1 python3.9[198512]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:57:00 compute-1 sudo[198510]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:01 compute-1 ceph-mon[79770]: pgmap v465: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:57:01 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:57:01 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:57:01 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:01.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:57:01 compute-1 sudo[198633]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ptqxvbpcypbkjemmbfzshmybfslufosa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015020.4203186-4162-158177263821505/AnsiballZ_copy.py'
Dec 06 09:57:01 compute-1 sudo[198633]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:01 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:57:01 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:57:01 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:01.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:57:01 compute-1 python3.9[198635]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765015020.4203186-4162-158177263821505/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:57:01 compute-1 sudo[198633]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:01 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:01 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e8004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:01 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:01 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140037d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:01 compute-1 sudo[198785]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-krwlokbovvewjgepmymrbbuzqyjzvaie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015021.6784155-4206-79333795060797/AnsiballZ_stat.py'
Dec 06 09:57:01 compute-1 sudo[198785]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:02 compute-1 python3.9[198787]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:57:02 compute-1 sudo[198785]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:02 compute-1 sudo[198909]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-emzyxgftvrpebozlaucpqvtsdjkocqyo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015021.6784155-4206-79333795060797/AnsiballZ_copy.py'
Dec 06 09:57:02 compute-1 sudo[198909]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:02 compute-1 python3.9[198911]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765015021.6784155-4206-79333795060797/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:57:02 compute-1 sudo[198909]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:02 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00042e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:03 compute-1 ceph-mon[79770]: pgmap v466: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:57:03 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:57:03 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:57:03 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:03.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:57:03 compute-1 sudo[199061]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-caggcmemwoailggpwyhegdiomvgrjdec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015022.9861555-4251-207071306201720/AnsiballZ_systemd.py'
Dec 06 09:57:03 compute-1 sudo[199061]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:03 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:57:03 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:57:03 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:03.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:57:03 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:57:03 compute-1 python3.9[199063]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:57:03 compute-1 systemd[1]: Reloading.
Dec 06 09:57:03 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:03 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c003e70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:03 compute-1 systemd-sysv-generator[199094]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:57:03 compute-1 systemd-rc-local-generator[199089]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:57:03 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:03 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e8004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:03 compute-1 systemd[1]: Reached target edpm_libvirt.target.
Dec 06 09:57:03 compute-1 sudo[199061]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:04 compute-1 sudo[199253]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kgjakzcdtjvodwbkrcheyaifblmtqfqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015024.2563596-4275-99625990756551/AnsiballZ_systemd.py'
Dec 06 09:57:04 compute-1 sudo[199253]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:04 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e8004000 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:04 compute-1 python3.9[199255]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Dec 06 09:57:04 compute-1 systemd[1]: Reloading.
Dec 06 09:57:04 compute-1 systemd-sysv-generator[199283]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:57:04 compute-1 systemd-rc-local-generator[199278]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:57:05 compute-1 ceph-mon[79770]: pgmap v467: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 09:57:05 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:57:05 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:57:05 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:05.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:57:05 compute-1 systemd[1]: Reloading.
Dec 06 09:57:05 compute-1 systemd-rc-local-generator[199319]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:57:05 compute-1 systemd-sysv-generator[199322]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:57:05 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:57:05 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:57:05 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:05.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:57:05 compute-1 sudo[199253]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:05 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:05 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00042e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:05 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:05 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c003e70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:06 compute-1 sshd-session[141573]: Connection closed by 192.168.122.30 port 40528
Dec 06 09:57:06 compute-1 sshd-session[141569]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:57:06 compute-1 systemd[1]: session-52.scope: Deactivated successfully.
Dec 06 09:57:06 compute-1 systemd[1]: session-52.scope: Consumed 3min 45.764s CPU time.
Dec 06 09:57:06 compute-1 systemd-logind[788]: Session 52 logged out. Waiting for processes to exit.
Dec 06 09:57:06 compute-1 systemd-logind[788]: Removed session 52.
Dec 06 09:57:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:06 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140037d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:07 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:57:07 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:57:07 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:07.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:57:07 compute-1 ceph-mon[79770]: pgmap v468: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:57:07 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:57:07 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:57:07 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:07.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:57:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:07 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140037d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:07 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00042e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:08 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:57:08 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:08 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00042e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:09 compute-1 sudo[199353]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 09:57:09 compute-1 sudo[199353]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:57:09 compute-1 sudo[199353]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:09 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:57:09 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:57:09 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:09.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:57:09 compute-1 ceph-mon[79770]: pgmap v469: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:57:09 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 09:57:09 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:57:09 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:57:09 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:09.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:57:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:09 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00042e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:09 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140037d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:10 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:10 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:11 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:57:11 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:57:11 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:11.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:57:11 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:57:11 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:57:11 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:11.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:57:11 compute-1 ceph-mon[79770]: pgmap v470: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:57:11 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:11 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c004f70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:11 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:11 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00042e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:11 compute-1 sshd-session[199379]: Accepted publickey for zuul from 192.168.122.30 port 40024 ssh2: ECDSA SHA256:r1j7aLsKAM+XxDNbzEU5vWGpGNCOaIBwc7FZdATPttA
Dec 06 09:57:11 compute-1 systemd-logind[788]: New session 53 of user zuul.
Dec 06 09:57:11 compute-1 systemd[1]: Started Session 53 of User zuul.
Dec 06 09:57:11 compute-1 sshd-session[199379]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 06 09:57:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:12 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140037d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:12 compute-1 python3.9[199533]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:57:13 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:57:13 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:57:13 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:13.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:57:13 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:57:13 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:57:13 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:13.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:57:13 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:57:13 compute-1 ceph-mon[79770]: pgmap v471: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:57:13 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:13 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0003c30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:13 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:13 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c004f70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:14 compute-1 python3.9[199687]: ansible-ansible.builtin.service_facts Invoked
Dec 06 09:57:14 compute-1 network[199705]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 06 09:57:14 compute-1 network[199706]: 'network-scripts' will be removed from distribution in near future.
Dec 06 09:57:14 compute-1 network[199707]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 06 09:57:14 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:14 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00042e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:15 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:57:15 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:57:15 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:15.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:57:15 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:57:15 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:57:15 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:15.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:57:15 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:15 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140037d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:15 compute-1 ceph-mon[79770]: pgmap v472: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 09:57:15 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:15 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0003c50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:16 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:16 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c004f70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:16 compute-1 ceph-mon[79770]: pgmap v473: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:57:17 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:57:17 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:57:17 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:17.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:57:17 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:57:17 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:57:17 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:17.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:57:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:17 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00042e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:17 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140037d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:18 compute-1 sudo[199979]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qfxhafamrfjdlmuldyukrvnuabymbbgn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015038.0637908-102-271671166342220/AnsiballZ_setup.py'
Dec 06 09:57:18 compute-1 sudo[199979]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:18 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:57:18 compute-1 python3.9[199981]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 06 09:57:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:18 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0003c70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:18 compute-1 sudo[199979]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:19 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:57:19 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:57:19 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:19.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:57:19 compute-1 sudo[200063]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fnooysvyfisiqeohmcrvaxbjpfqczsdm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015038.0637908-102-271671166342220/AnsiballZ_dnf.py'
Dec 06 09:57:19 compute-1 sudo[200063]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:19 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:57:19 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:57:19 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:19.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:57:19 compute-1 ceph-mon[79770]: pgmap v474: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:57:19 compute-1 python3.9[200065]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 06 09:57:19 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:19 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c004f70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:19 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:19 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00042e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:20 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:20 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140037d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:21 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:57:21 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:57:21 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:21.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:57:21 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:57:21 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:57:21 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:21.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:57:21 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:21 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2180022a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:21 compute-1 ceph-mon[79770]: pgmap v475: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:57:21 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:21 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c004f70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:22 compute-1 ceph-mon[79770]: pgmap v476: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:57:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:22 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00042e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:23 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:57:23 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:57:23 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:23.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:57:23 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:57:23 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:57:23 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:23.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:57:23 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:57:23 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:23 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140037d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:23 compute-1 podman[200070]: 2025-12-06 09:57:23.818647211 +0000 UTC m=+0.119465786 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec 06 09:57:23 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:23 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2180022a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:23 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 09:57:24 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:24 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c004f70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:25 compute-1 ceph-mon[79770]: pgmap v477: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 09:57:25 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:57:25 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:57:25 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:25.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:57:25 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:57:25 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:57:25 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:25.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:57:25 compute-1 sudo[200063]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:25 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:25 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00042e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:25 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:25 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140037d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:26 compute-1 sudo[200246]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gujlzjcawsfgiapbtedosbjtwhcaggtt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015045.6527622-138-90018900102256/AnsiballZ_stat.py'
Dec 06 09:57:26 compute-1 sudo[200246]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:26 compute-1 python3.9[200248]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:57:26 compute-1 sudo[200246]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:26 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:26 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2180022a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:27 compute-1 sudo[200399]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfskigxtazqcmswjhczektyfaoagrnfy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015046.6446748-168-160581570647248/AnsiballZ_command.py'
Dec 06 09:57:27 compute-1 sudo[200399]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:27 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:57:27 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:57:27 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:27.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:57:27 compute-1 python3.9[200401]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:57:27 compute-1 sudo[200399]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:27 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:57:27 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:57:27 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:27.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:57:27 compute-1 ceph-mon[79770]: pgmap v478: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:57:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:27 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2180022a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:27 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00042e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:28 compute-1 sudo[200564]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kewvnmhwjfaigvmrpmonjdhgacoijcnx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015047.8241305-198-77575581929944/AnsiballZ_stat.py'
Dec 06 09:57:28 compute-1 sudo[200564]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:28 compute-1 podman[200526]: 2025-12-06 09:57:28.13175862 +0000 UTC m=+0.065037937 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 06 09:57:28 compute-1 python3.9[200572]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:57:28 compute-1 sudo[200564]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:28 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:57:28 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:28 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140037d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:28 compute-1 sudo[200725]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-edjhhmefnogfyhdvnggzbewbchyhcuwz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015048.5358994-222-62868003219366/AnsiballZ_command.py'
Dec 06 09:57:28 compute-1 sudo[200725]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:29 compute-1 python3.9[200727]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:57:29 compute-1 sudo[200725]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:29 compute-1 ceph-mon[79770]: pgmap v479: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:57:29 compute-1 sudo[200730]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 09:57:29 compute-1 sudo[200730]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:57:29 compute-1 sudo[200730]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:29 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:57:29 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:57:29 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:29.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:57:29 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:57:29 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:57:29 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:29.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:57:29 compute-1 sudo[200903]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vrrmfkchkobbkdcprysupocxnkytpvov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015049.3007045-246-26049286193805/AnsiballZ_stat.py'
Dec 06 09:57:29 compute-1 sudo[200903]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:29 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:29 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140037d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:29 compute-1 python3.9[200905]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:57:29 compute-1 sudo[200903]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:29 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:29 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140037d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:30 compute-1 sudo[201027]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wdsnzdghbozhoendxhltqfehbudadfhq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015049.3007045-246-26049286193805/AnsiballZ_copy.py'
Dec 06 09:57:30 compute-1 sudo[201027]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:30 compute-1 python3.9[201029]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765015049.3007045-246-26049286193805/.source.iscsi _original_basename=.r3uie0hx follow=False checksum=eaccd56aaf590b98db17b6975888b71367194346 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:57:30 compute-1 sudo[201027]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:30 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:30 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00042e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:31 compute-1 sudo[201179]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wiyjmjfxasndvxmhsqtgubfsudwnmyja ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015050.811192-291-112656429133116/AnsiballZ_file.py'
Dec 06 09:57:31 compute-1 sudo[201179]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:31 compute-1 ceph-mon[79770]: pgmap v480: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:57:31 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:57:31 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:57:31 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:31.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:57:31 compute-1 python3.9[201181]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:57:31 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:57:31 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:57:31 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:31.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:57:31 compute-1 sudo[201179]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:31 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:31 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c005110 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:31 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:31 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2180089d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:32 compute-1 sudo[201331]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ibffqnrjvlezycknidkmehhcaxzbdfhm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015051.6814413-315-230946965388507/AnsiballZ_lineinfile.py'
Dec 06 09:57:32 compute-1 sudo[201331]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:32 compute-1 python3.9[201333]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:57:32 compute-1 sudo[201331]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:32 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140037d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:33 compute-1 sudo[201484]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-flgegmtekerxkluwhbxmqzjqmvufdawu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015052.586212-342-93440922926466/AnsiballZ_systemd_service.py'
Dec 06 09:57:33 compute-1 sudo[201484]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:33 compute-1 ceph-mon[79770]: pgmap v481: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:57:33 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:57:33 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:57:33 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:33.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:57:33 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:57:33 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:57:33 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:33.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:57:33 compute-1 python3.9[201486]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:57:33 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:57:33 compute-1 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Dec 06 09:57:33 compute-1 sudo[201484]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:33 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:33 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00042e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:33 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:33 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c005110 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:34 compute-1 sudo[201640]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vvphpkzsvjwfdepjuxkokhoragmklexm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015053.915253-366-136371300184910/AnsiballZ_systemd_service.py'
Dec 06 09:57:34 compute-1 sudo[201640]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:34 compute-1 python3.9[201642]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:57:34 compute-1 systemd[1]: Reloading.
Dec 06 09:57:34 compute-1 systemd-sysv-generator[201676]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:57:34 compute-1 systemd-rc-local-generator[201673]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:57:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:34 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2180089d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:34 compute-1 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Dec 06 09:57:34 compute-1 systemd[1]: Starting Open-iSCSI...
Dec 06 09:57:34 compute-1 kernel: Loading iSCSI transport class v2.0-870.
Dec 06 09:57:34 compute-1 systemd[1]: Started Open-iSCSI.
Dec 06 09:57:34 compute-1 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Dec 06 09:57:34 compute-1 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Dec 06 09:57:34 compute-1 sudo[201640]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:35 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:57:35 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:57:35 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:35.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:57:35 compute-1 ceph-mon[79770]: pgmap v482: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 09:57:35 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:57:35 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:57:35 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:35.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:57:35 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:35 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2180089d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:35 compute-1 sudo[201842]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ssiovcphuxvfbjcyrfvsliltdycyeqph ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015055.5055306-399-109367970013402/AnsiballZ_service_facts.py'
Dec 06 09:57:35 compute-1 sudo[201842]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:35 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:35 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00042e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:35 compute-1 python3.9[201844]: ansible-ansible.builtin.service_facts Invoked
Dec 06 09:57:36 compute-1 network[201861]: You are using the 'network' service provided by 'network-scripts', which are now deprecated.
Dec 06 09:57:36 compute-1 network[201862]: 'network-scripts' will be removed from the distribution in the near future.
Dec 06 09:57:36 compute-1 network[201863]: It is advised to switch to 'NetworkManager' for network management.
Dec 06 09:57:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:36 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c005110 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:37 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:57:37 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:57:37 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:37.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:57:37 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:57:37 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:57:37 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:37.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:57:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:37 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140037d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:37 compute-1 ceph-mon[79770]: pgmap v483: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:57:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:37 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140037d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:38 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:57:38 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:38 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00042e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:38 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/095738 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
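Note: this Layer4 failure is consistent with one NFS-Ganesha instance in the cluster restarting (connections are refused while the daemon is down); the same backend is reported UP again at 09:58:00 below, after the local ganesha daemon has gone through its grace cycle.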
Dec 06 09:57:39 compute-1 sudo[201842]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:39 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:57:39 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:57:39 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:39.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:57:39 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:57:39 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:57:39 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:39.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:57:39 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:39 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c005110 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:39 compute-1 ceph-mon[79770]: pgmap v484: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:57:39 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 09:57:39 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:39 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2180089d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:40 compute-1 sudo[202136]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-btgbsmdntgpfuqyabzecnehknlmumfvs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015059.831841-429-230755805909987/AnsiballZ_file.py'
Dec 06 09:57:40 compute-1 sudo[202136]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:40 compute-1 python3.9[202138]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec 06 09:57:40 compute-1 sudo[202136]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:40 compute-1 ceph-mon[79770]: pgmap v485: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:57:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:40 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e8001bd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:41 compute-1 sudo[202289]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ulkwblctglcmnuynkfhxhprvsnirriez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015060.7310588-453-184788044735990/AnsiballZ_modprobe.py'
Dec 06 09:57:41 compute-1 sudo[202289]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:41 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:57:41 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:57:41 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:41.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:57:41 compute-1 python3.9[202291]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Dec 06 09:57:41 compute-1 sudo[202289]: pam_unix(sudo:session): session closed for user root
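Note: with state=present, community.general.modprobe loads dm-multipath only if it is not already loaded. A minimal Python sketch of that idempotent check, offered as an approximation rather than the module's actual implementation:

    import subprocess

    def ensure_module(name: str) -> None:
        # /proc/modules lists loaded modules with underscores, e.g. dm_multipath
        listed = name.replace("-", "_")
        with open("/proc/modules") as f:
            loaded = any(line.split()[0] == listed for line in f)
        if not loaded:
            subprocess.run(["modprobe", name], check=True)

    ensure_module("dm-multipath")

persistent=disabled means the module call itself makes no persistence arrangements; the tasks that follow handle that explicitly.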
Dec 06 09:57:41 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:57:41 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:57:41 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:41.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:57:41 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:41 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00042e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:41 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:41 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c005110 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:41 compute-1 sudo[202445]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vjnzvkivjabgufjnjlkzyhpnkrphjqda ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015061.6562243-477-13239230047815/AnsiballZ_stat.py'
Dec 06 09:57:41 compute-1 sudo[202445]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:42 compute-1 python3.9[202447]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:57:42 compute-1 sudo[202445]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:42 compute-1 sudo[202569]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qvupisbkhpllywohdfmvholvscypiarv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015061.6562243-477-13239230047815/AnsiballZ_copy.py'
Dec 06 09:57:42 compute-1 sudo[202569]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:42 compute-1 python3.9[202571]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765015061.6562243-477-13239230047815/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:57:42 compute-1 sudo[202569]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:42 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2180089d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:43 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:57:43 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:57:43 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:43.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:57:43 compute-1 sudo[202721]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fcewgfnttkmqtoorxctrqciuwhktxzpk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015063.073154-525-263794379366595/AnsiballZ_lineinfile.py'
Dec 06 09:57:43 compute-1 sudo[202721]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:43 compute-1 ceph-mon[79770]: pgmap v486: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:57:43 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:57:43 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:57:43 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:43.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:57:43 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:57:43 compute-1 python3.9[202723]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:57:43 compute-1 sudo[202721]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:43 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:43 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e8001bd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:43 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:43 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00042e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:44 compute-1 sudo[202874]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wethwblnfqcyivvfcduvwftjigohtjap ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015063.7657986-549-74938894858800/AnsiballZ_systemd.py'
Dec 06 09:57:44 compute-1 sudo[202874]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:44 compute-1 python3.9[202876]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 06 09:57:44 compute-1 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Dec 06 09:57:44 compute-1 systemd[1]: Stopped Load Kernel Modules.
Dec 06 09:57:44 compute-1 systemd[1]: Stopping Load Kernel Modules...
Dec 06 09:57:44 compute-1 systemd[1]: Starting Load Kernel Modules...
Dec 06 09:57:44 compute-1 systemd[1]: Finished Load Kernel Modules.
Dec 06 09:57:44 compute-1 sudo[202874]: pam_unix(sudo:session): session closed for user root
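Note: the tasks above make the dm-multipath module persistent and effective immediately: a drop-in under /etc/modules-load.d (its template body is masked as NOT_LOGGING_PARAMETER in the journal, but it presumably carries just the module name), the same name mirrored into /etc/modules, and a restart of systemd-modules-load.service. A rough Python equivalent under that assumption:

    import subprocess
    from pathlib import Path

    # Assumed content of the rendered module-load.conf.j2 template.
    Path("/etc/modules-load.d/dm-multipath.conf").write_text("dm-multipath\n")
    subprocess.run(["systemctl", "restart", "systemd-modules-load.service"],
                   check=True)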
Dec 06 09:57:44 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:44 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c005110 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:45 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:57:45 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:57:45 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:45.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:57:45 compute-1 sudo[203030]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ynvbmsqeglknpsxzhynzffygeevykcaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015065.1138878-573-264552234627295/AnsiballZ_file.py'
Dec 06 09:57:45 compute-1 sudo[203030]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:45 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:57:45 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:57:45 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:45.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:57:45 compute-1 ceph-mon[79770]: pgmap v487: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Dec 06 09:57:45 compute-1 python3.9[203032]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:57:45 compute-1 sudo[203030]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:45 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:45 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2180089d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:45 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:45 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e8001bd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:46 compute-1 sudo[203183]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hgkrfwgtkbeokvyywhkioezmptghfxbr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015066.1176646-600-107217929034311/AnsiballZ_stat.py'
Dec 06 09:57:46 compute-1 sudo[203183]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:46 compute-1 python3.9[203185]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:57:46 compute-1 sudo[203183]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:46 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:46 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00042e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:47 compute-1 sudo[203335]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yabcuhgmkauzxwycvazrpczdfsawqavd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015066.944521-627-116907472754087/AnsiballZ_stat.py'
Dec 06 09:57:47 compute-1 sudo[203335]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:47 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:57:47 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:57:47 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:47.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:57:47 compute-1 python3.9[203337]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:57:47 compute-1 sudo[203335]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:47 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:57:47 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:57:47 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:47.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:57:47 compute-1 ceph-mon[79770]: pgmap v488: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 06 09:57:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:47 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c005110 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:47 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2180089d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:47 compute-1 sudo[203487]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bofujvbacglvhoqyoqsqhbskykuprtvz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015067.631713-651-67470341557495/AnsiballZ_stat.py'
Dec 06 09:57:47 compute-1 sudo[203487]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:48 compute-1 python3.9[203489]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:57:48 compute-1 sudo[203487]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:48 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:48 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 09:57:48 compute-1 sudo[203611]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sznmmaahmiiabavctiuvmdneazrhlkju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015067.631713-651-67470341557495/AnsiballZ_copy.py'
Dec 06 09:57:48 compute-1 sudo[203611]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:48 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:57:48 compute-1 python3.9[203613]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765015067.631713-651-67470341557495/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:57:48 compute-1 sudo[203611]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:48 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:48 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e8002e70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:49 compute-1 sudo[203719]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 09:57:49 compute-1 sudo[203719]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:57:49 compute-1 sudo[203719]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:49 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:57:49 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:57:49 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:49.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:57:49 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:57:49 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:57:49 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:49.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:57:49 compute-1 sudo[203788]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dvyghawekdpgtgxikrkqfwpgubvleexi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015068.9424455-696-274168132772754/AnsiballZ_command.py'
Dec 06 09:57:49 compute-1 sudo[203788]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:49 compute-1 ceph-mon[79770]: pgmap v489: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Dec 06 09:57:49 compute-1 python3.9[203790]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:57:49 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:49 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00042e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:49 compute-1 sudo[203788]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:49 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:49 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c005110 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:50 compute-1 sudo[203941]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zlveaqtjnjbwzvmvcjohqwsclgfljmdd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015069.9590838-720-235079825739687/AnsiballZ_lineinfile.py'
Dec 06 09:57:50 compute-1 sudo[203941]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:50 compute-1 python3.9[203944]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:57:50 compute-1 sudo[203941]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:50 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:50 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2180089d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:51 compute-1 sudo[204094]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ewfacwrvqzugcoivmhrcfzrsqorimfmz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015070.6837072-744-226655707344575/AnsiballZ_replace.py'
Dec 06 09:57:51 compute-1 sudo[204094]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:51 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:57:51 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:57:51 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:51.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:57:51 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:51 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 09:57:51 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:51 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 09:57:51 compute-1 python3.9[204096]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:57:51 compute-1 sudo[204094]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:51 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:57:51 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:57:51 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:51.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:57:51 compute-1 ceph-mon[79770]: pgmap v490: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Dec 06 09:57:51 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:51 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e8002e70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:51 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:51 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00042e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:51 compute-1 sudo[204247]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jpqdbisxzsfcwfuksnfqypngxctmiuih ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015071.6405458-768-134689416367640/AnsiballZ_replace.py'
Dec 06 09:57:51 compute-1 sudo[204247]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:52 compute-1 python3.9[204249]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:57:52 compute-1 sudo[204247]: pam_unix(sudo:session): session closed for user root
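Note: taken together, the grep check and the lineinfile/replace edits above normalize the blacklist section of /etc/multipath.conf: ensure a 'blacklist {' opener exists, close it with '}' on the following line, and collapse any pre-existing catch-all devnode ".*" entry. A sketch of the net effect, reconstructed purely from the logged module arguments (the playbook itself is not shown in this journal):

    import re

    with open("/etc/multipath.conf") as f:
        conf = f.read()
    if not re.search(r"^blacklist\s*{", conf, flags=re.M):
        conf += "blacklist {\n"                    # lineinfile: add the opener
    conf = re.sub(r"^(blacklist {)", r"\1\n}", conf, flags=re.M)  # close it
    conf = re.sub(r'^blacklist\s*{\n\s+devnode "\.\*"', "blacklist {",
                  conf, flags=re.M)                # drop the catch-all devnode
    with open("/etc/multipath.conf", "w") as f:
        f.write(conf)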
Dec 06 09:57:52 compute-1 sudo[204400]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jivkpnzlecdwwayiwinidkktskjkuarn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015072.4433985-795-89092033961737/AnsiballZ_lineinfile.py'
Dec 06 09:57:52 compute-1 sudo[204400]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:52 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0002550 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:52 compute-1 python3.9[204402]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:57:52 compute-1 sudo[204400]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:53 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:57:53 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:57:53 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:53.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:57:53 compute-1 sudo[204552]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rldtovcmmlevkjnreqrneskqyohupprx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015073.0653865-795-229975360182442/AnsiballZ_lineinfile.py'
Dec 06 09:57:53 compute-1 sudo[204552]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:53 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:57:53 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:57:53 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:53.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:57:53 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:57:53 compute-1 python3.9[204554]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:57:53 compute-1 sudo[204552]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:53 compute-1 ceph-mon[79770]: pgmap v491: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Dec 06 09:57:53 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:53 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2180089d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:53 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:53 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e8002e70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:54 compute-1 sudo[204714]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zyaemlrmkkczolnpkuuwrlipdtgqfunc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015073.7062476-795-15583865783253/AnsiballZ_lineinfile.py'
Dec 06 09:57:54 compute-1 sudo[204714]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:57:54.267 141446 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:57:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:57:54.268 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:57:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:57:54.269 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:57:54 compute-1 podman[204678]: 2025-12-06 09:57:54.305034219 +0000 UTC m=+0.122280545 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 06 09:57:54 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:54 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
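Note: grace was lifted well inside the advertised 90-second window: the reaper's check above reported reclaim complete(0) and clid count(0), i.e. no clients held reclaimable state, so ganesha could leave grace early.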
Dec 06 09:57:54 compute-1 python3.9[204717]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:57:54 compute-1 sudo[204714]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:54 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 09:57:54 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:54 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00042e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:54 compute-1 sudo[204885]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ckwkqwlatkbowtkseqtkvsjtiqrsjauh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015074.6099694-795-256969320528293/AnsiballZ_lineinfile.py'
Dec 06 09:57:54 compute-1 sudo[204885]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:55 compute-1 python3.9[204887]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:57:55 compute-1 sudo[204885]: pam_unix(sudo:session): session closed for user root
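Note: the four lineinfile tasks at 09:57:52-09:57:55 each anchor on insertafter=^defaults with firstmatch, so after all of them the defaults section of /etc/multipath.conf presumably contains lines like the following (relative order depends on which lines already existed):

    defaults {
            find_multipaths yes
            recheck_wwid yes
            skip_kpartx yes
            user_friendly_names no
    }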
Dec 06 09:57:55 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:57:55 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:57:55 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:55.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:57:55 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:57:55 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:57:55 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:55.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:57:55 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:55 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0002550 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:55 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:55 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2180089d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:55 compute-1 sudo[205037]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-frbuvbgfhugbvqhoxmnbmgnltobhznon ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015075.631047-882-84925998572632/AnsiballZ_stat.py'
Dec 06 09:57:55 compute-1 sudo[205037]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:56 compute-1 ceph-mon[79770]: pgmap v492: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.3 KiB/s rd, 938 B/s wr, 3 op/s
Dec 06 09:57:56 compute-1 python3.9[205039]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:57:56 compute-1 sudo[205037]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:56 compute-1 sudo[205192]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zpfozxcshiitvwwmsnslrbulkrrmxvqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015076.3925672-906-247747212263239/AnsiballZ_file.py'
Dec 06 09:57:56 compute-1 sudo[205192]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:56 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:56 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e8002e70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:56 compute-1 python3.9[205194]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/multipath/.multipath_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:57:56 compute-1 sudo[205192]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:56 compute-1 ceph-mon[79770]: pgmap v493: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 938 B/s wr, 3 op/s
Dec 06 09:57:57 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:57:57 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:57:57 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:57.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:57:57 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:57:57 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:57:57 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:57.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:57:57 compute-1 sudo[205344]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bhintpfmhfaigleknjzrysokursylqgh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015077.350641-933-43005541640321/AnsiballZ_file.py'
Dec 06 09:57:57 compute-1 sudo[205344]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:57 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00042e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:57 compute-1 python3.9[205346]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:57:57 compute-1 sudo[205344]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:57 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0002550 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:58 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:57:58 compute-1 sudo[205378]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:57:58 compute-1 sudo[205378]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:57:58 compute-1 sudo[205378]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:58 compute-1 sudo[205455]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 06 09:57:58 compute-1 sudo[205455]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:57:58 compute-1 podman[205427]: 2025-12-06 09:57:58.592340384 +0000 UTC m=+0.052440182 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Dec 06 09:57:58 compute-1 sudo[205566]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tewypylqvpisuseyxdfnjfyxmbziwexf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015078.4947612-957-256256857130187/AnsiballZ_stat.py'
Dec 06 09:57:58 compute-1 sudo[205566]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:58 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:58 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2180089d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:58 compute-1 python3.9[205570]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:57:59 compute-1 sudo[205566]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:59 compute-1 sudo[205455]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:59 compute-1 sudo[205674]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hfblhboyqztuevurfedkczloflvkajor ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015078.4947612-957-256256857130187/AnsiballZ_file.py'
Dec 06 09:57:59 compute-1 sudo[205674]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:59 compute-1 ceph-mon[79770]: pgmap v494: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.4 KiB/s rd, 1023 B/s wr, 3 op/s
Dec 06 09:57:59 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:57:59 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:57:59 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:57:59.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:57:59 compute-1 python3.9[205676]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:57:59 compute-1 sudo[205674]: pam_unix(sudo:session): session closed for user root
Dec 06 09:57:59 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:57:59 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:57:59 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:57:59.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:57:59 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:59 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e8002e70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:57:59 compute-1 sudo[205826]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aicqkkugspxhfbkyjyvuflfrewpgabwj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015079.5764852-957-265150780697082/AnsiballZ_stat.py'
Dec 06 09:57:59 compute-1 sudo[205826]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:57:59 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:57:59 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00042e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:00 compute-1 python3.9[205828]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:58:00 compute-1 sudo[205826]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:00 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 09:58:00 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 06 09:58:00 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:58:00 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:58:00 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 06 09:58:00 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 06 09:58:00 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 09:58:00 compute-1 sudo[205905]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sbsqiuyiaxmgxuoybzjldbyquonnpxoo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015079.5764852-957-265150780697082/AnsiballZ_file.py'
Dec 06 09:58:00 compute-1 sudo[205905]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:58:00 compute-1 python3.9[205907]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:58:00 compute-1 sudo[205905]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:00 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:00 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0002550 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:00 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/095800 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 1ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec 06 09:58:01 compute-1 ceph-mon[79770]: pgmap v495: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.0 KiB/s rd, 938 B/s wr, 3 op/s
Dec 06 09:58:01 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:58:01 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.003000075s ======
Dec 06 09:58:01 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:01.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000075s
Dec 06 09:58:01 compute-1 sudo[206057]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uslelkqdnmcoeriuhiwcgghnekeqhyhe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015081.143283-1026-199087906019126/AnsiballZ_file.py'
Dec 06 09:58:01 compute-1 sudo[206057]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:58:01 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:58:01 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:58:01 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:01.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:58:01 compute-1 python3.9[206059]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:58:01 compute-1 sudo[206057]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:01 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:01 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2180089d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:01 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:01 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e8002e70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:02 compute-1 sudo[206209]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hnxmnwmbpyycjcodsignspjgheneyebx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015081.9506862-1051-254157287462803/AnsiballZ_stat.py'
Dec 06 09:58:02 compute-1 sudo[206209]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:58:02 compute-1 python3.9[206211]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:58:02 compute-1 sudo[206209]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:02 compute-1 sudo[206288]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-itwrpvssecxclhkvxtyktvqdlwteaole ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015081.9506862-1051-254157287462803/AnsiballZ_file.py'
Dec 06 09:58:02 compute-1 sudo[206288]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:58:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:02 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00042e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:03 compute-1 python3.9[206290]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:58:03 compute-1 sudo[206288]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:03 compute-1 ceph-mon[79770]: pgmap v496: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.0 KiB/s rd, 938 B/s wr, 3 op/s
Dec 06 09:58:03 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:58:03 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:58:03 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:03.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:58:03 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:58:03 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:58:03 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:03.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:58:03 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:58:03 compute-1 sudo[206440]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yqwqyqwjuudancorekgqibnpqqjjxbzx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015083.3807817-1086-210782684125737/AnsiballZ_stat.py'
Dec 06 09:58:03 compute-1 sudo[206440]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:58:03 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:03 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0002550 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:03 compute-1 python3.9[206442]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:58:03 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:03 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2180089d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:03 compute-1 sudo[206440]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:04 compute-1 sudo[206518]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-saxyzstkwzquoieuvmxuspryjqeuoxgh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015083.3807817-1086-210782684125737/AnsiballZ_file.py'
Dec 06 09:58:04 compute-1 sudo[206518]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:58:04 compute-1 python3.9[206520]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:58:04 compute-1 sudo[206518]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:04 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e8002e70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:05 compute-1 sudo[206671]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fvytxbrchlmusjikjtqfftaedhfoimqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015084.7737825-1122-170549064533533/AnsiballZ_systemd.py'
Dec 06 09:58:05 compute-1 sudo[206671]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:58:05 compute-1 sudo[206674]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:58:05 compute-1 sudo[206674]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:58:05 compute-1 sudo[206674]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:05 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:58:05 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:58:05 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:05.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:58:05 compute-1 ceph-mon[79770]: pgmap v497: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.0 KiB/s rd, 938 B/s wr, 3 op/s
Dec 06 09:58:05 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:58:05 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:58:05 compute-1 python3.9[206673]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:58:05 compute-1 systemd[1]: Reloading.
Dec 06 09:58:05 compute-1 systemd-rc-local-generator[206724]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:58:05 compute-1 systemd-sysv-generator[206727]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:58:05 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:58:05 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:58:05 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:05.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:58:05 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:05 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00042e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:05 compute-1 sudo[206671]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:05 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:05 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0002550 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:06 compute-1 sudo[206885]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dmtopjzgyayonnvupbulojvsnywoyvte ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015086.1763136-1146-181641210531432/AnsiballZ_stat.py'
Dec 06 09:58:06 compute-1 sudo[206885]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:58:06 compute-1 python3.9[206887]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:58:06 compute-1 sudo[206885]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:06 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2180089d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:07 compute-1 sudo[206963]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wnrqebbhwutdzqtvoivwplpefgztlvje ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015086.1763136-1146-181641210531432/AnsiballZ_file.py'
Dec 06 09:58:07 compute-1 sudo[206963]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:58:07 compute-1 python3.9[206965]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:58:07 compute-1 sudo[206963]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:07 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:58:07 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:58:07 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:07.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:58:07 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:58:07 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:58:07 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:07.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:58:07 compute-1 ceph-mon[79770]: pgmap v498: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Dec 06 09:58:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:07 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e8002e70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:07 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00042e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:08 compute-1 sudo[207115]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dpsnattsiifdoccixsdogkgevryhkxkw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015087.793467-1182-173490448944530/AnsiballZ_stat.py'
Dec 06 09:58:08 compute-1 sudo[207115]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:58:08 compute-1 python3.9[207117]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:58:08 compute-1 sudo[207115]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:08 compute-1 sudo[207194]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jrsqhcnhidqxiiyfahakzovzktayapvu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015087.793467-1182-173490448944530/AnsiballZ_file.py'
Dec 06 09:58:08 compute-1 sudo[207194]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:58:08 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:58:08 compute-1 python3.9[207196]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:58:08 compute-1 sudo[207194]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:08 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:08 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0004080 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:09 compute-1 sudo[207257]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 09:58:09 compute-1 sudo[207257]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:58:09 compute-1 sudo[207257]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:09 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:58:09 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:58:09 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:09.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:58:09 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:58:09 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:58:09 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:09.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:58:09 compute-1 sudo[207371]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fxxjvzetwotpccvmxmljyowkekebgqyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015089.217964-1218-182584434266011/AnsiballZ_systemd.py'
Dec 06 09:58:09 compute-1 sudo[207371]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:58:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:09 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2180089d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:09 compute-1 python3.9[207373]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:58:09 compute-1 systemd[1]: Reloading.
Dec 06 09:58:09 compute-1 ceph-mon[79770]: pgmap v499: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 85 B/s wr, 0 op/s
Dec 06 09:58:09 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 09:58:09 compute-1 systemd-sysv-generator[207405]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:58:09 compute-1 systemd-rc-local-generator[207402]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:58:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:09 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e8002e70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:10 compute-1 systemd[1]: Starting Create netns directory...
Dec 06 09:58:10 compute-1 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 06 09:58:10 compute-1 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 06 09:58:10 compute-1 systemd[1]: Finished Create netns directory.
Dec 06 09:58:10 compute-1 sudo[207371]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:10 compute-1 ceph-mon[79770]: pgmap v500: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:58:10 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:10 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00042e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:11 compute-1 sudo[207567]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kdsjqlwhswvnnjaeahhlxrzquvsyczbi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015090.9147146-1248-153464905393152/AnsiballZ_file.py'
Dec 06 09:58:11 compute-1 sudo[207567]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:58:11 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:58:11 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:58:11 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:11.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:58:11 compute-1 python3.9[207569]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:58:11 compute-1 sudo[207567]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:11 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:58:11 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:58:11 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:11.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:58:11 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:11 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0004080 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:11 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:11 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb214001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:12 compute-1 sudo[207719]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pveiladeygceazdqodqmigrgnexqaiux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015091.7797637-1272-180924122192738/AnsiballZ_stat.py'
Dec 06 09:58:12 compute-1 sudo[207719]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:58:12 compute-1 python3.9[207721]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:58:12 compute-1 sudo[207719]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:12 compute-1 sudo[207843]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mzscstyhmcbrewpylntoegqxubmmmyzb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015091.7797637-1272-180924122192738/AnsiballZ_copy.py'
Dec 06 09:58:12 compute-1 sudo[207843]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:58:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:12 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2180089d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:12 compute-1 python3.9[207845]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/multipathd/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765015091.7797637-1272-180924122192738/.source _original_basename=healthcheck follow=False checksum=af9d0c1c8f3cb0e30ce9609be9d5b01924d0d23f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:58:12 compute-1 sudo[207843]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:13 compute-1 ceph-mon[79770]: pgmap v501: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:58:13 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:58:13 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:58:13 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:13.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:58:13 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:58:13 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:58:13 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:13.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:58:13 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:58:13 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:13 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2180089d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:13 compute-1 sudo[207995]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-blqcitmpxncgmhrmpwwzlqbfzsywwvny ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015093.559972-1323-174388956748911/AnsiballZ_file.py'
Dec 06 09:58:13 compute-1 sudo[207995]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:58:13 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:13 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2180089d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:14 compute-1 python3.9[207997]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:58:14 compute-1 sudo[207995]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:14 compute-1 sudo[208148]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-skzhsrisnophcscwlvaajsndwsqysohq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015094.4522076-1347-5478813197091/AnsiballZ_stat.py'
Dec 06 09:58:14 compute-1 sudo[208148]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:58:14 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:14 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb214001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:14 compute-1 python3.9[208150]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:58:14 compute-1 sudo[208148]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:15 compute-1 ceph-mon[79770]: pgmap v502: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:58:15 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:58:15 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:58:15 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:15.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:58:15 compute-1 sudo[208271]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uycxdombxpazxaohkqdcipkaozqanboh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015094.4522076-1347-5478813197091/AnsiballZ_copy.py'
Dec 06 09:58:15 compute-1 sudo[208271]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:58:15 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:58:15 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:58:15 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:15.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:58:15 compute-1 python3.9[208273]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/multipathd.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1765015094.4522076-1347-5478813197091/.source.json _original_basename=.5mlf3od5 follow=False checksum=3f7959ee8ac9757398adcc451c3b416c957d7c14 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:58:15 compute-1 sudo[208271]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:15 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:15 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00042e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:15 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:15 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00042e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:16 compute-1 sudo[208423]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qmkklqdrjiuwjqpoeqixbotqlhtaobcs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015095.9060314-1392-255814764871841/AnsiballZ_file.py'
Dec 06 09:58:16 compute-1 sudo[208423]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:58:16 compute-1 python3.9[208425]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:58:16 compute-1 sudo[208423]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:16 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:16 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2180089d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:17 compute-1 sudo[208576]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gabwgjmivuzzxxsslipnhnarzpprpgrq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015096.722149-1416-254945246466187/AnsiballZ_stat.py'
Dec 06 09:58:17 compute-1 sudo[208576]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:58:17 compute-1 sudo[208576]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:17 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:58:17 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:58:17 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:17.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:58:17 compute-1 ceph-mon[79770]: pgmap v503: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:58:17 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:58:17 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:58:17 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:17.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:58:17 compute-1 sudo[208699]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xcmclfjtdbpikcwfyhthyszuhavptzmn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015096.722149-1416-254945246466187/AnsiballZ_copy.py'
Dec 06 09:58:17 compute-1 sudo[208699]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:58:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:17 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2180089d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:17 compute-1 sudo[208699]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:17 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00042e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:18 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:58:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:18 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2180089d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:19 compute-1 sudo[208852]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-szngdyuxfbvicgsikjetueynzgxmbliz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015098.55167-1467-242539580417382/AnsiballZ_container_config_data.py'
Dec 06 09:58:19 compute-1 sudo[208852]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:58:19 compute-1 python3.9[208854]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False
Dec 06 09:58:19 compute-1 sudo[208852]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:19 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:58:19 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:58:19 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:19.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:58:19 compute-1 ceph-mon[79770]: pgmap v504: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 09:58:19 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:58:19 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 09:58:19 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:19.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 09:58:19 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:19 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0004080 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:19 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:19 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140037d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:19 compute-1 sudo[209004]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvyxjmoosnaimearzikewdgigqfjciyi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015099.5813224-1494-47840090898643/AnsiballZ_container_config_hash.py'
Dec 06 09:58:19 compute-1 sudo[209004]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:58:20 compute-1 python3.9[209006]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 06 09:58:20 compute-1 sudo[209004]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:20 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:20 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f00042e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:21 compute-1 sudo[209157]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vryqcmojrkpzfipsrslvbveopyoblijb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015100.609211-1521-86754695197890/AnsiballZ_podman_container_info.py'
Dec 06 09:58:21 compute-1 sudo[209157]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:58:21 compute-1 python3.9[209159]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec 06 09:58:21 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:58:21 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:58:21 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:21.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:58:21 compute-1 ceph-mon[79770]: pgmap v505: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:58:21 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:58:21 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:58:21 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:21.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:58:21 compute-1 sudo[209157]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:21 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:21 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2180089d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:21 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:21 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2180089d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:22 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140037d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:23 compute-1 sudo[209337]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mdzmltjfsawfevndntyskdvgvitafcna ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765015102.6445568-1560-182692489378865/AnsiballZ_edpm_container_manage.py'
Dec 06 09:58:23 compute-1 sudo[209337]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:58:23 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:58:23 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:58:23 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:23.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:58:23 compute-1 ceph-mon[79770]: pgmap v506: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:58:23 compute-1 python3[209339]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Dec 06 09:58:23 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:58:23 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:58:23 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:58:23 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:23.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:58:23 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:23 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0004300 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:23 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:23 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c003690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:24 compute-1 systemd[1]: virtnodedevd.service: Deactivated successfully.
Dec 06 09:58:24 compute-1 podman[209354]: 2025-12-06 09:58:24.530125812 +0000 UTC m=+0.998665165 image pull 9af6aa52ee187025bc25565b66d3eefb486acac26f9281e33f4cce76a40d21f7 quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842
Dec 06 09:58:24 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 09:58:24 compute-1 podman[209392]: 2025-12-06 09:58:24.571506537 +0000 UTC m=+0.199792007 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 06 09:58:24 compute-1 podman[209441]: 2025-12-06 09:58:24.700414928 +0000 UTC m=+0.060839167 container create ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd)
Dec 06 09:58:24 compute-1 podman[209441]: 2025-12-06 09:58:24.669977485 +0000 UTC m=+0.030401744 image pull 9af6aa52ee187025bc25565b66d3eefb486acac26f9281e33f4cce76a40d21f7 quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842
Dec 06 09:58:24 compute-1 python3[209339]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name multipathd --conmon-pidfile /run/multipathd.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=multipathd --label container_name=multipathd --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run/udev:/run/udev --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /var/lib/openstack/healthchecks/multipathd:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842
Dec 06 09:58:24 compute-1 sudo[209337]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:24 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:24 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0004080 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:25 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:58:25 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:58:25 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:25.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:58:25 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:58:25 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:58:25 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:25.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:58:25 compute-1 ceph-mon[79770]: pgmap v507: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:58:25 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:25 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0004080 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:25 compute-1 systemd[1]: virtproxyd.service: Deactivated successfully.
Dec 06 09:58:25 compute-1 sudo[209630]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hdpyjbueifeekymmegtcvrsuobnkrazc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015105.5194209-1584-248521203274459/AnsiballZ_stat.py'
Dec 06 09:58:25 compute-1 sudo[209630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:58:25 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:25 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0004300 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:26 compute-1 python3.9[209632]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:58:26 compute-1 sudo[209630]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:26 compute-1 sudo[209785]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yifqtfjozkylrnkevxrghpctywzzidjv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015106.4309537-1611-176339745868902/AnsiballZ_file.py'
Dec 06 09:58:26 compute-1 sudo[209785]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:58:26 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:26 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c001470 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:26 compute-1 python3.9[209787]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:58:26 compute-1 sudo[209785]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:27 compute-1 sudo[209861]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eodgomgsnyqwtsetjnnumnaxgsmlfhdg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015106.4309537-1611-176339745868902/AnsiballZ_stat.py'
Dec 06 09:58:27 compute-1 sudo[209861]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:58:27 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:58:27 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 09:58:27 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:27.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 09:58:27 compute-1 python3.9[209863]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:58:27 compute-1 sudo[209861]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:27 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:58:27 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:58:27 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:27.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:58:27 compute-1 ceph-mon[79770]: pgmap v508: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:58:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:27 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0004080 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:27 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0004080 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:27 compute-1 sudo[210012]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjzqtxzhohrcuvumsobudnqunlufhpgq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015107.5045774-1611-254324911939266/AnsiballZ_copy.py'
Dec 06 09:58:27 compute-1 sudo[210012]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:58:28 compute-1 python3.9[210014]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765015107.5045774-1611-254324911939266/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:58:28 compute-1 sudo[210012]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:28 compute-1 sudo[210089]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dsddvskxpiyqqctcadxgqzqwcnngvcdt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015107.5045774-1611-254324911939266/AnsiballZ_systemd.py'
Dec 06 09:58:28 compute-1 sudo[210089]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:58:28 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:58:28 compute-1 python3.9[210091]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 06 09:58:28 compute-1 systemd[1]: Reloading.
Dec 06 09:58:28 compute-1 podman[210092]: 2025-12-06 09:58:28.797164551 +0000 UTC m=+0.090611994 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Dec 06 09:58:28 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:28 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0004320 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:28 compute-1 systemd-rc-local-generator[210141]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:58:28 compute-1 systemd-sysv-generator[210146]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:58:29 compute-1 sudo[210089]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:29 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:58:29 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:58:29 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:29.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:58:29 compute-1 sudo[210172]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 09:58:29 compute-1 sudo[210172]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:58:29 compute-1 sudo[210172]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:29 compute-1 sudo[210246]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dixxwrsissvdgvuakkznpsdeldjubycz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015107.5045774-1611-254324911939266/AnsiballZ_systemd.py'
Dec 06 09:58:29 compute-1 sudo[210246]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:58:29 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:58:29 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:58:29 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:29.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:58:29 compute-1 ceph-mon[79770]: pgmap v509: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 09:58:29 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:29 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c001470 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:29 compute-1 python3.9[210248]: ansible-systemd Invoked with state=restarted name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:58:29 compute-1 systemd[1]: Reloading.
Dec 06 09:58:29 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:29 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c001470 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:29 compute-1 systemd-sysv-generator[210285]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:58:29 compute-1 systemd-rc-local-generator[210282]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:58:30 compute-1 systemd[1]: Starting multipathd container...
Dec 06 09:58:30 compute-1 systemd[1]: Started libcrun container.
Dec 06 09:58:30 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/730c45c7dd8dcda876f2bc17b3c61b25832c8554cbb3bcfa917da2d02fcaf626/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec 06 09:58:30 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/730c45c7dd8dcda876f2bc17b3c61b25832c8554cbb3bcfa917da2d02fcaf626/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec 06 09:58:30 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59.
Dec 06 09:58:30 compute-1 podman[210293]: 2025-12-06 09:58:30.413324753 +0000 UTC m=+0.144929330 container init ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd)
Dec 06 09:58:30 compute-1 multipathd[210308]: + sudo -E kolla_set_configs
Dec 06 09:58:30 compute-1 podman[210293]: 2025-12-06 09:58:30.442046434 +0000 UTC m=+0.173650971 container start ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd)
Dec 06 09:58:30 compute-1 podman[210293]: multipathd
Dec 06 09:58:30 compute-1 systemd[1]: Started multipathd container.
Dec 06 09:58:30 compute-1 sudo[210315]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 06 09:58:30 compute-1 sudo[210315]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Dec 06 09:58:30 compute-1 sudo[210315]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 06 09:58:30 compute-1 sudo[210246]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:30 compute-1 multipathd[210308]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 06 09:58:30 compute-1 multipathd[210308]: INFO:__main__:Validating config file
Dec 06 09:58:30 compute-1 multipathd[210308]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 06 09:58:30 compute-1 multipathd[210308]: INFO:__main__:Writing out command to execute
Dec 06 09:58:30 compute-1 podman[210314]: 2025-12-06 09:58:30.554331524 +0000 UTC m=+0.100692804 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, container_name=multipathd, io.buildah.version=1.41.3, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Dec 06 09:58:30 compute-1 sudo[210315]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:30 compute-1 multipathd[210308]: ++ cat /run_command
Dec 06 09:58:30 compute-1 systemd[1]: ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59-ebceffeba97e509.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:58:30 compute-1 systemd[1]: ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59-ebceffeba97e509.service: Failed with result 'exit-code'.
Dec 06 09:58:30 compute-1 multipathd[210308]: + CMD='/usr/sbin/multipathd -d'
Dec 06 09:58:30 compute-1 multipathd[210308]: + ARGS=
Dec 06 09:58:30 compute-1 multipathd[210308]: + sudo kolla_copy_cacerts
Dec 06 09:58:30 compute-1 sudo[210360]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Dec 06 09:58:30 compute-1 sudo[210360]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Dec 06 09:58:30 compute-1 sudo[210360]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 06 09:58:30 compute-1 sudo[210360]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:30 compute-1 multipathd[210308]: + [[ ! -n '' ]]
Dec 06 09:58:30 compute-1 multipathd[210308]: + . kolla_extend_start
Dec 06 09:58:30 compute-1 multipathd[210308]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Dec 06 09:58:30 compute-1 multipathd[210308]: Running command: '/usr/sbin/multipathd -d'
Dec 06 09:58:30 compute-1 multipathd[210308]: + umask 0022
Dec 06 09:58:30 compute-1 multipathd[210308]: + exec /usr/sbin/multipathd -d
Dec 06 09:58:30 compute-1 multipathd[210308]: 3483.291667 | --------start up--------
Dec 06 09:58:30 compute-1 multipathd[210308]: 3483.291874 | read /etc/multipath.conf
Dec 06 09:58:30 compute-1 multipathd[210308]: 3483.300140 | path checkers start up
Dec 06 09:58:30 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:30 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c001470 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:31 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:58:31 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:58:31 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:31.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:58:31 compute-1 python3.9[210495]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:58:31 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:58:31 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:58:31 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:31.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:58:31 compute-1 ceph-mon[79770]: pgmap v510: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:58:31 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:31 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f4002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:31 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:31 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c001470 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:31 compute-1 sudo[210647]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ozogqwribfuydrriogoupkohxxgovslv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015111.6280618-1719-180306589661693/AnsiballZ_command.py'
Dec 06 09:58:31 compute-1 sudo[210647]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:58:32 compute-1 python3.9[210649]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps --filter volume=/etc/multipath.conf --format {{.Names}} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:58:32 compute-1 sudo[210647]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:32 compute-1 sudo[210813]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tjaqyfegkmvnlzzhqckdvnouuiyrnybd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015112.4785104-1743-225696944332001/AnsiballZ_systemd.py'
Dec 06 09:58:32 compute-1 sudo[210813]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:58:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:32 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140037d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:33 compute-1 python3.9[210815]: ansible-ansible.builtin.systemd Invoked with name=edpm_multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 06 09:58:33 compute-1 systemd[1]: Stopping multipathd container...
Dec 06 09:58:33 compute-1 multipathd[210308]: 3485.965419 | exit (signal)
Dec 06 09:58:33 compute-1 multipathd[210308]: 3485.965533 | --------shut down-------
Dec 06 09:58:33 compute-1 systemd[1]: libpod-ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59.scope: Deactivated successfully.
Dec 06 09:58:33 compute-1 podman[210819]: 2025-12-06 09:58:33.321704337 +0000 UTC m=+0.090180394 container died ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125)
Dec 06 09:58:33 compute-1 systemd[1]: ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59-ebceffeba97e509.timer: Deactivated successfully.
Dec 06 09:58:33 compute-1 systemd[1]: Stopped /usr/bin/podman healthcheck run ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59.
Dec 06 09:58:33 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59-userdata-shm.mount: Deactivated successfully.
Dec 06 09:58:33 compute-1 systemd[1]: var-lib-containers-storage-overlay-730c45c7dd8dcda876f2bc17b3c61b25832c8554cbb3bcfa917da2d02fcaf626-merged.mount: Deactivated successfully.
Dec 06 09:58:33 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:58:33 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:58:33 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:33.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:58:33 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:58:33 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:58:33 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:58:33 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:33.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:58:33 compute-1 podman[210819]: 2025-12-06 09:58:33.560055637 +0000 UTC m=+0.328531724 container cleanup ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 06 09:58:33 compute-1 podman[210819]: multipathd
Dec 06 09:58:33 compute-1 podman[210847]: multipathd
Dec 06 09:58:33 compute-1 systemd[1]: edpm_multipathd.service: Deactivated successfully.
Dec 06 09:58:33 compute-1 systemd[1]: Stopped multipathd container.
Dec 06 09:58:33 compute-1 systemd[1]: Starting multipathd container...
Dec 06 09:58:33 compute-1 systemd[1]: Started libcrun container.
Dec 06 09:58:33 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/730c45c7dd8dcda876f2bc17b3c61b25832c8554cbb3bcfa917da2d02fcaf626/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec 06 09:58:33 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/730c45c7dd8dcda876f2bc17b3c61b25832c8554cbb3bcfa917da2d02fcaf626/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec 06 09:58:33 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:33 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0004080 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:33 compute-1 ceph-mon[79770]: pgmap v511: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:58:33 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59.
Dec 06 09:58:33 compute-1 podman[210861]: 2025-12-06 09:58:33.770531427 +0000 UTC m=+0.118590887 container init ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 06 09:58:33 compute-1 multipathd[210877]: + sudo -E kolla_set_configs
Dec 06 09:58:33 compute-1 sudo[210883]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 06 09:58:33 compute-1 podman[210861]: 2025-12-06 09:58:33.799193617 +0000 UTC m=+0.147253057 container start ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 06 09:58:33 compute-1 sudo[210883]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Dec 06 09:58:33 compute-1 sudo[210883]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 06 09:58:33 compute-1 podman[210861]: multipathd
Dec 06 09:58:33 compute-1 systemd[1]: Started multipathd container.
Dec 06 09:58:33 compute-1 multipathd[210877]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 06 09:58:33 compute-1 multipathd[210877]: INFO:__main__:Validating config file
Dec 06 09:58:33 compute-1 multipathd[210877]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 06 09:58:33 compute-1 multipathd[210877]: INFO:__main__:Writing out command to execute
Dec 06 09:58:33 compute-1 sudo[210883]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:33 compute-1 sudo[210813]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:33 compute-1 multipathd[210877]: ++ cat /run_command
Dec 06 09:58:33 compute-1 multipathd[210877]: + CMD='/usr/sbin/multipathd -d'
Dec 06 09:58:33 compute-1 multipathd[210877]: + ARGS=
Dec 06 09:58:33 compute-1 multipathd[210877]: + sudo kolla_copy_cacerts
Dec 06 09:58:33 compute-1 sudo[210905]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Dec 06 09:58:33 compute-1 sudo[210905]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Dec 06 09:58:33 compute-1 sudo[210905]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 06 09:58:33 compute-1 sudo[210905]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:33 compute-1 multipathd[210877]: + [[ ! -n '' ]]
Dec 06 09:58:33 compute-1 multipathd[210877]: + . kolla_extend_start
Dec 06 09:58:33 compute-1 multipathd[210877]: Running command: '/usr/sbin/multipathd -d'
Dec 06 09:58:33 compute-1 multipathd[210877]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Dec 06 09:58:33 compute-1 multipathd[210877]: + umask 0022
Dec 06 09:58:33 compute-1 multipathd[210877]: + exec /usr/sbin/multipathd -d
Dec 06 09:58:33 compute-1 podman[210884]: 2025-12-06 09:58:33.890024086 +0000 UTC m=+0.077315325 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 06 09:58:33 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:33 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f4002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:33 compute-1 systemd[1]: ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59-4676a25146bf9417.service: Main process exited, code=exited, status=1/FAILURE
Dec 06 09:58:33 compute-1 systemd[1]: ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59-4676a25146bf9417.service: Failed with result 'exit-code'.
Dec 06 09:58:33 compute-1 multipathd[210877]: 3486.575341 | --------start up--------
Dec 06 09:58:33 compute-1 multipathd[210877]: 3486.575366 | read /etc/multipath.conf
Dec 06 09:58:33 compute-1 multipathd[210877]: 3486.582330 | path checkers start up
Dec 06 09:58:34 compute-1 sudo[211068]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwclmkncakosqikhiweowbmktqzqwhwc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015114.3005598-1767-88585244162284/AnsiballZ_file.py'
Dec 06 09:58:34 compute-1 sudo[211068]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:58:34 compute-1 python3.9[211070]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:58:34 compute-1 sudo[211068]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:34 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c001470 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:35 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:58:35 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 09:58:35 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:35.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 09:58:35 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:58:35 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:58:35 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:35.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:58:35 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:35 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140037d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:35 compute-1 ceph-mon[79770]: pgmap v512: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:58:35 compute-1 sudo[211220]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fyjqfnqzvqacdpvfxayavhigxyvdczak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015115.501698-1803-64160692348378/AnsiballZ_file.py'
Dec 06 09:58:35 compute-1 sudo[211220]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:58:35 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:35 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0004080 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:36 compute-1 python3.9[211222]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec 06 09:58:36 compute-1 sudo[211220]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:36 compute-1 sudo[211373]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-caaelakkcykeqziydskpyjquykuvwrcs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015116.3624384-1827-11303511144110/AnsiballZ_modprobe.py'
Dec 06 09:58:36 compute-1 sudo[211373]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:58:36 compute-1 ceph-mon[79770]: pgmap v513: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:58:36 compute-1 systemd[1]: virtsecretd.service: Deactivated successfully.
Dec 06 09:58:36 compute-1 systemd[1]: virtqemud.service: Deactivated successfully.
Dec 06 09:58:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:36 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f4002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:36 compute-1 python3.9[211375]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Dec 06 09:58:36 compute-1 kernel: Key type psk registered
Dec 06 09:58:36 compute-1 sudo[211373]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:37 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:58:37 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:58:37 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:37.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:58:37 compute-1 sudo[211538]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oyohmdzprpilmhareixfcdbhnhgvltri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015117.184472-1851-204537384215914/AnsiballZ_stat.py'
Dec 06 09:58:37 compute-1 sudo[211538]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:58:37 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:58:37 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:58:37 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:37.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:58:37 compute-1 python3.9[211540]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:58:37 compute-1 sudo[211538]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:37 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f4002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:37 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140037d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:38 compute-1 sudo[211661]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-poeufgovigpaivkxadgchfsjdbtqavby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015117.184472-1851-204537384215914/AnsiballZ_copy.py'
Dec 06 09:58:38 compute-1 sudo[211661]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:58:38 compute-1 python3.9[211663]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765015117.184472-1851-204537384215914/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:58:38 compute-1 sudo[211661]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:38 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:58:38 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:38 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c001470 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:39 compute-1 sudo[211814]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eiehybuvfwntomruznmllyhoscovstco ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015118.9464538-1899-168839334170436/AnsiballZ_lineinfile.py'
Dec 06 09:58:39 compute-1 sudo[211814]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:58:39 compute-1 ceph-mon[79770]: pgmap v514: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 09:58:39 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 09:58:39 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:58:39 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 09:58:39 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:39.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 09:58:39 compute-1 python3.9[211816]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:58:39 compute-1 sudo[211814]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:39 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:58:39 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:58:39 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:39.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:58:39 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:39 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c001470 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:39 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:39 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f4002690 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:40 compute-1 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #34. Immutable memtables: 0.
Dec 06 09:58:40 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:58:40.199019) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 06 09:58:40 compute-1 ceph-mon[79770]: rocksdb: [db/flush_job.cc:856] [default] [JOB 17] Flushing memtable with next log file: 34
Dec 06 09:58:40 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015120199358, "job": 17, "event": "flush_started", "num_memtables": 1, "num_entries": 1328, "num_deletes": 256, "total_data_size": 3264946, "memory_usage": 3311728, "flush_reason": "Manual Compaction"}
Dec 06 09:58:40 compute-1 ceph-mon[79770]: rocksdb: [db/flush_job.cc:885] [default] [JOB 17] Level-0 flush table #35: started
Dec 06 09:58:40 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015120217192, "cf_name": "default", "job": 17, "event": "table_file_creation", "file_number": 35, "file_size": 2138927, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 18148, "largest_seqno": 19471, "table_properties": {"data_size": 2133279, "index_size": 3039, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1541, "raw_key_size": 11449, "raw_average_key_size": 18, "raw_value_size": 2121946, "raw_average_value_size": 3455, "num_data_blocks": 137, "num_entries": 614, "num_filter_entries": 614, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015004, "oldest_key_time": 1765015004, "file_creation_time": 1765015120, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Dec 06 09:58:40 compute-1 ceph-mon[79770]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 17] Flush lasted 18446 microseconds, and 10038 cpu microseconds.
Dec 06 09:58:40 compute-1 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 09:58:40 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:58:40.217492) [db/flush_job.cc:967] [default] [JOB 17] Level-0 flush table #35: 2138927 bytes OK
Dec 06 09:58:40 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:58:40.217626) [db/memtable_list.cc:519] [default] Level-0 commit table #35 started
Dec 06 09:58:40 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:58:40.220107) [db/memtable_list.cc:722] [default] Level-0 commit table #35: memtable #1 done
Dec 06 09:58:40 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:58:40.220257) EVENT_LOG_v1 {"time_micros": 1765015120220231, "job": 17, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 06 09:58:40 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:58:40.220325) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 06 09:58:40 compute-1 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 17] Try to delete WAL files size 3258646, prev total WAL file size 3258646, number of live WAL files 2.
Dec 06 09:58:40 compute-1 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000031.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 09:58:40 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:58:40.222917) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0030' seq:72057594037927935, type:22 .. '6C6F676D00323533' seq:0, type:0; will stop at (end)
Dec 06 09:58:40 compute-1 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 18] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 06 09:58:40 compute-1 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 17 Base level 0, inputs: [35(2088KB)], [33(11MB)]
Dec 06 09:58:40 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015120223064, "job": 18, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [35], "files_L6": [33], "score": -1, "input_data_size": 14473186, "oldest_snapshot_seqno": -1}
Dec 06 09:58:40 compute-1 sudo[211966]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mnymvwryssgsscfvksionwrdmivrhpvf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015119.8633235-1923-157408746944454/AnsiballZ_systemd.py'
Dec 06 09:58:40 compute-1 sudo[211966]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:58:40 compute-1 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 18] Generated table #36: 4958 keys, 13987457 bytes, temperature: kUnknown
Dec 06 09:58:40 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015120338652, "cf_name": "default", "job": 18, "event": "table_file_creation", "file_number": 36, "file_size": 13987457, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13952617, "index_size": 21354, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12421, "raw_key_size": 126161, "raw_average_key_size": 25, "raw_value_size": 13860895, "raw_average_value_size": 2795, "num_data_blocks": 876, "num_entries": 4958, "num_filter_entries": 4958, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014012, "oldest_key_time": 0, "file_creation_time": 1765015120, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Dec 06 09:58:40 compute-1 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 09:58:40 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:58:40.339049) [db/compaction/compaction_job.cc:1663] [default] [JOB 18] Compacted 1@0 + 1@6 files to L6 => 13987457 bytes
Dec 06 09:58:40 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:58:40.340389) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 125.1 rd, 120.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.0, 11.8 +0.0 blob) out(13.3 +0.0 blob), read-write-amplify(13.3) write-amplify(6.5) OK, records in: 5484, records dropped: 526 output_compression: NoCompression
Dec 06 09:58:40 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:58:40.340409) EVENT_LOG_v1 {"time_micros": 1765015120340398, "job": 18, "event": "compaction_finished", "compaction_time_micros": 115731, "compaction_time_cpu_micros": 35985, "output_level": 6, "num_output_files": 1, "total_output_size": 13987457, "num_input_records": 5484, "num_output_records": 4958, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 06 09:58:40 compute-1 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000035.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 09:58:40 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015120340867, "job": 18, "event": "table_file_deletion", "file_number": 35}
Dec 06 09:58:40 compute-1 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000033.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 09:58:40 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015120343107, "job": 18, "event": "table_file_deletion", "file_number": 33}
Dec 06 09:58:40 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:58:40.222789) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 09:58:40 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:58:40.343305) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 09:58:40 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:58:40.343314) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 09:58:40 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:58:40.343316) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 09:58:40 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:58:40.343318) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 09:58:40 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-09:58:40.343320) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 09:58:40 compute-1 python3.9[211969]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 06 09:58:40 compute-1 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Dec 06 09:58:40 compute-1 systemd[1]: Stopped Load Kernel Modules.
Dec 06 09:58:40 compute-1 systemd[1]: Stopping Load Kernel Modules...
Dec 06 09:58:40 compute-1 systemd[1]: Starting Load Kernel Modules...
Dec 06 09:58:40 compute-1 systemd[1]: Finished Load Kernel Modules.
Dec 06 09:58:40 compute-1 sudo[211966]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:40 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140037d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:41 compute-1 ceph-mon[79770]: pgmap v515: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:58:41 compute-1 sudo[212123]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wfgnqkcqwrgjizsngydxvkduovgzusad ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015121.0058215-1947-135939968907556/AnsiballZ_dnf.py'
Dec 06 09:58:41 compute-1 sudo[212123]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:58:41 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:58:41 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:58:41 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:41.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:58:41 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:58:41 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:58:41 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:41.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:58:41 compute-1 python3.9[212125]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 06 09:58:41 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:41 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c001470 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:41 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:41 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0004080 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:42 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0004080 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:43 compute-1 ceph-mon[79770]: pgmap v516: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:58:43 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:58:43 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 09:58:43 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:43.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 09:58:43 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:58:43 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:58:43 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 09:58:43 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:43.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 09:58:43 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:43 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140037d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:43 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:43 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c001470 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:44 compute-1 systemd[1]: Reloading.
Dec 06 09:58:44 compute-1 systemd-sysv-generator[212161]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:58:44 compute-1 systemd-rc-local-generator[212156]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:58:44 compute-1 systemd[1]: Reloading.
Dec 06 09:58:44 compute-1 systemd-rc-local-generator[212191]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:58:44 compute-1 systemd-sysv-generator[212196]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:58:44 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:44 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0004080 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:44 compute-1 systemd-logind[788]: Watching system buttons on /dev/input/event0 (Power Button)
Dec 06 09:58:44 compute-1 systemd-logind[788]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Dec 06 09:58:45 compute-1 lvm[212240]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 06 09:58:45 compute-1 lvm[212240]: VG ceph_vg0 finished
Dec 06 09:58:45 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 06 09:58:45 compute-1 systemd[1]: Starting man-db-cache-update.service...
Dec 06 09:58:45 compute-1 systemd[1]: Reloading.
Dec 06 09:58:45 compute-1 systemd-sysv-generator[212293]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:58:45 compute-1 systemd-rc-local-generator[212289]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:58:45 compute-1 ceph-mon[79770]: pgmap v517: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:58:45 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:58:45 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:58:45 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:45.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:58:45 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 06 09:58:45 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:58:45 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:58:45 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:45.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:58:45 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:45 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f4003700 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:45 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:45 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140037d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:46 compute-1 sudo[212123]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:46 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 06 09:58:46 compute-1 systemd[1]: Finished man-db-cache-update.service.
Dec 06 09:58:46 compute-1 systemd[1]: man-db-cache-update.service: Consumed 1.774s CPU time.
Dec 06 09:58:46 compute-1 systemd[1]: run-ra10d1b80ac27411e8f7c82adae0528ec.service: Deactivated successfully.
Dec 06 09:58:46 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:46 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140037d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:47 compute-1 sudo[213582]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vuwmehlrnylpeyhihrzdipsabokubfke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015126.784896-1971-278025385040302/AnsiballZ_systemd_service.py'
Dec 06 09:58:47 compute-1 sudo[213582]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:58:47 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:58:47 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:58:47 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:47.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:58:47 compute-1 ceph-mon[79770]: pgmap v518: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:58:47 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:58:47 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:58:47 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:47.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:58:47 compute-1 python3.9[213584]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 06 09:58:47 compute-1 systemd[1]: Stopping Open-iSCSI...
Dec 06 09:58:47 compute-1 iscsid[201683]: iscsid shutting down.
Dec 06 09:58:47 compute-1 systemd[1]: iscsid.service: Deactivated successfully.
Dec 06 09:58:47 compute-1 systemd[1]: Stopped Open-iSCSI.
Dec 06 09:58:47 compute-1 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Dec 06 09:58:47 compute-1 systemd[1]: Starting Open-iSCSI...
Dec 06 09:58:47 compute-1 systemd[1]: Started Open-iSCSI.
Dec 06 09:58:47 compute-1 sudo[213582]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:47 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0004080 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:47 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f4003700 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:48 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:58:48 compute-1 python3.9[213739]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 06 09:58:48 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:48 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140037d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:49 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:58:49 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:58:49 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:49.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:58:49 compute-1 ceph-mon[79770]: pgmap v519: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 09:58:49 compute-1 sudo[213850]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 09:58:49 compute-1 sudo[213850]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:58:49 compute-1 sudo[213850]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:49 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:58:49 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:58:49 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:49.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:58:49 compute-1 sudo[213918]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gicsaahywwugxrzrwruminwyhusmtlir ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015129.2418664-2023-126999544569879/AnsiballZ_file.py'
Dec 06 09:58:49 compute-1 sudo[213918]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:58:49 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:49 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c001470 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:49 compute-1 python3.9[213920]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:58:49 compute-1 sudo[213918]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:49 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:49 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0004080 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:50 compute-1 sudo[214071]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ipocpencgjjtmryhfpjuywzttdhbhcgq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015130.3958051-2056-41078130617774/AnsiballZ_systemd_service.py'
Dec 06 09:58:50 compute-1 sudo[214071]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:58:50 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:50 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f4003700 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:50 compute-1 python3.9[214073]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 06 09:58:51 compute-1 systemd[1]: Reloading.
Dec 06 09:58:51 compute-1 systemd-rc-local-generator[214099]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:58:51 compute-1 systemd-sysv-generator[214103]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:58:51 compute-1 sudo[214071]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:51 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:58:51 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:58:51 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:51.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:58:51 compute-1 ceph-mon[79770]: pgmap v520: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:58:51 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:58:51 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:58:51 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:51.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:58:51 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:51 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140037d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:51 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:51 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c001470 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:52 compute-1 python3.9[214257]: ansible-ansible.builtin.service_facts Invoked
Dec 06 09:58:52 compute-1 network[214274]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 06 09:58:52 compute-1 network[214275]: 'network-scripts' will be removed from distribution in near future.
Dec 06 09:58:52 compute-1 network[214276]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 06 09:58:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:52 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0004080 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:53 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:58:53 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:58:53 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:53.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:58:53 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:58:53 compute-1 ceph-mon[79770]: pgmap v521: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:58:53 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:58:53 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:58:53 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:53.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:58:53 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:53 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f4003700 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:53 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:53 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140037d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:58:54.268 141446 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:58:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:58:54.270 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:58:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:58:54.270 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:58:54 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 09:58:54 compute-1 podman[214360]: 2025-12-06 09:58:54.826847351 +0000 UTC m=+0.120447643 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller)
Dec 06 09:58:54 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:54 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c001470 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:55 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:58:55 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:58:55 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:55.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:58:55 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:58:55 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:58:55 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:55.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:58:55 compute-1 ceph-mon[79770]: pgmap v522: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:58:55 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:55 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0004080 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:55 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:55 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f4003700 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:56 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:56 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140037d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:57 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:58:57 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:58:57 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:57.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:58:57 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:58:57 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:58:57 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:57.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:58:57 compute-1 ceph-mon[79770]: pgmap v523: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:58:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:57 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c001470 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:57 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0004080 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:58 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:58:58 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:58 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f4003700 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:59 compute-1 sudo[214579]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hqzvevkfietlwkftoxffwfyijemkxxrv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015138.7812579-2113-263474367165086/AnsiballZ_systemd_service.py'
Dec 06 09:58:59 compute-1 sudo[214579]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:58:59 compute-1 python3.9[214581]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:58:59 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:58:59 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:58:59 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:58:59.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:58:59 compute-1 sudo[214579]: pam_unix(sudo:session): session closed for user root
Dec 06 09:58:59 compute-1 podman[214583]: 2025-12-06 09:58:59.504041864 +0000 UTC m=+0.067571503 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Dec 06 09:58:59 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:58:59 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:58:59 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:58:59.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:58:59 compute-1 ceph-mon[79770]: pgmap v524: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 09:58:59 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:59 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2140037d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:59 compute-1 sudo[214751]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djdjchtkdqxtzisxvlcniabqfjqvqpsa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015139.6156945-2113-165922426430955/AnsiballZ_systemd_service.py'
Dec 06 09:58:59 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:58:59 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_25] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c001470 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:58:59 compute-1 sudo[214751]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:59:00 compute-1 python3.9[214753]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:59:00 compute-1 sudo[214751]: pam_unix(sudo:session): session closed for user root
Dec 06 09:59:00 compute-1 sudo[214905]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dozjmaiglafafwjrpjbkgccsfdzvnsyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015140.4004571-2113-137896965516109/AnsiballZ_systemd_service.py'
Dec 06 09:59:00 compute-1 sudo[214905]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:59:00 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:00 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0004080 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:59:00 compute-1 python3.9[214907]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:59:01 compute-1 sudo[214905]: pam_unix(sudo:session): session closed for user root
Dec 06 09:59:01 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:59:01 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:59:01 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:01.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:59:01 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:59:01 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 09:59:01 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:01.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 09:59:01 compute-1 sudo[215060]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lxwnduppoucnkbwpjmklvaykgyuqlejz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015141.1910448-2113-184999410191084/AnsiballZ_systemd_service.py'
Dec 06 09:59:01 compute-1 ceph-mon[79770]: pgmap v525: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:59:01 compute-1 sudo[215060]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:59:01 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:01 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f4003700 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:59:01 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:01 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f4003700 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:59:02 compute-1 python3.9[215062]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:59:02 compute-1 sudo[215060]: pam_unix(sudo:session): session closed for user root
Dec 06 09:59:02 compute-1 sudo[215214]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sajaidmrrlztrdegbkmuozpoatsqoyvu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015142.197156-2113-200704424252191/AnsiballZ_systemd_service.py'
Dec 06 09:59:02 compute-1 sudo[215214]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:59:02 compute-1 python3.9[215216]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:59:02 compute-1 sudo[215214]: pam_unix(sudo:session): session closed for user root
Dec 06 09:59:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:02 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c001470 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:59:02 compute-1 ceph-mon[79770]: pgmap v526: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:59:03 compute-1 sudo[215367]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lgbshwdcodrgklpuvcesznsewnbxojvt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015143.0327952-2113-233191794531446/AnsiballZ_systemd_service.py'
Dec 06 09:59:03 compute-1 sudo[215367]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:59:03 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:59:03 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:59:03 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:03.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:59:03 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:59:03 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:59:03 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:59:03 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:03.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:59:03 compute-1 python3.9[215369]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:59:03 compute-1 sudo[215367]: pam_unix(sudo:session): session closed for user root
Dec 06 09:59:03 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:03 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0004080 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:59:03 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:03 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb2180012b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:59:04 compute-1 sudo[215530]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nzcallvadtlpssdbexmlyyrjwzwuxtba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015143.8269956-2113-170984488246002/AnsiballZ_systemd_service.py'
Dec 06 09:59:04 compute-1 sudo[215530]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:59:04 compute-1 podman[215494]: 2025-12-06 09:59:04.183104595 +0000 UTC m=+0.077848588 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 09:59:04 compute-1 python3.9[215537]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:59:04 compute-1 sudo[215530]: pam_unix(sudo:session): session closed for user root
Dec 06 09:59:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:04 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f4004410 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:59:04 compute-1 sudo[215694]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qqatuurpqvwrxwzafhuqjgowoeytqvml ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015144.6750848-2113-152445260182500/AnsiballZ_systemd_service.py'
Dec 06 09:59:04 compute-1 sudo[215694]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:59:05 compute-1 python3.9[215696]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 09:59:05 compute-1 sudo[215694]: pam_unix(sudo:session): session closed for user root
Dec 06 09:59:05 compute-1 ceph-mon[79770]: pgmap v527: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:59:05 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:59:05 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 09:59:05 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:05.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 09:59:05 compute-1 sudo[215722]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:59:05 compute-1 sudo[215722]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:59:05 compute-1 sudo[215722]: pam_unix(sudo:session): session closed for user root
Dec 06 09:59:05 compute-1 sudo[215747]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 check-host
Dec 06 09:59:05 compute-1 sudo[215747]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:59:05 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:59:05 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:59:05 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:05.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:59:05 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:05 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c001470 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:59:05 compute-1 sudo[215747]: pam_unix(sudo:session): session closed for user root
Dec 06 09:59:05 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:05 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0004080 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:59:05 compute-1 sudo[215793]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 09:59:05 compute-1 sudo[215793]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:59:05 compute-1 sudo[215793]: pam_unix(sudo:session): session closed for user root
Dec 06 09:59:06 compute-1 sudo[215818]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 06 09:59:06 compute-1 sudo[215818]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:59:06 compute-1 sudo[215818]: pam_unix(sudo:session): session closed for user root
Dec 06 09:59:06 compute-1 sudo[216000]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xbylorvqvhezvbknyubxamcsxhbquwuy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015146.4567285-2290-224551935812343/AnsiballZ_file.py'
Dec 06 09:59:06 compute-1 sudo[216000]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:59:06 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:59:06 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:59:06 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:59:06 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:59:06 compute-1 ceph-mon[79770]: pgmap v528: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:59:06 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 09:59:06 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 06 09:59:06 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:59:06 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:59:06 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 06 09:59:06 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 06 09:59:06 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 09:59:06 compute-1 python3.9[216002]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:59:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:06 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb218008be0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:59:06 compute-1 sudo[216000]: pam_unix(sudo:session): session closed for user root
Dec 06 09:59:07 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:59:07 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:59:07 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:07.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:59:07 compute-1 sudo[216152]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hcmnaememiuznlchggebddagzduvntgo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015147.0809941-2290-7885649887614/AnsiballZ_file.py'
Dec 06 09:59:07 compute-1 sudo[216152]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:59:07 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:59:07 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:59:07 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:07.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:59:07 compute-1 python3.9[216154]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:59:07 compute-1 sudo[216152]: pam_unix(sudo:session): session closed for user root
Dec 06 09:59:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:07 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f4004410 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:59:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:07 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c001470 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:59:08 compute-1 sudo[216304]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lgzhuitouiorhbmkfbxigzggjmjdrayb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015147.7744837-2290-170945187582472/AnsiballZ_file.py'
Dec 06 09:59:08 compute-1 sudo[216304]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:59:08 compute-1 python3.9[216306]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:59:08 compute-1 sudo[216304]: pam_unix(sudo:session): session closed for user root
Dec 06 09:59:08 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:59:08 compute-1 sudo[216457]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cycbzmzglvyftsatfwmmixbltlbfqbwq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015148.3963535-2290-68302521618450/AnsiballZ_file.py'
Dec 06 09:59:08 compute-1 sudo[216457]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:59:08 compute-1 python3.9[216459]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:59:08 compute-1 sudo[216457]: pam_unix(sudo:session): session closed for user root
Dec 06 09:59:08 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:08 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0004080 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:59:09 compute-1 sudo[216609]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bmhkkibkpqkirjfzeywyhzhhttbykngr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015149.0017047-2290-238566118322695/AnsiballZ_file.py'
Dec 06 09:59:09 compute-1 sudo[216609]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:59:09 compute-1 ceph-mon[79770]: pgmap v529: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 09:59:09 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 09:59:09 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:59:09 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:59:09 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:09.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:59:09 compute-1 python3.9[216611]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:59:09 compute-1 sudo[216609]: pam_unix(sudo:session): session closed for user root
Dec 06 09:59:09 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:59:09 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:59:09 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:09.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:59:09 compute-1 sudo[216636]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 09:59:09 compute-1 sudo[216636]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:59:09 compute-1 sudo[216636]: pam_unix(sudo:session): session closed for user root
Dec 06 09:59:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:09 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb218008be0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:59:09 compute-1 sudo[216786]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cfjmxijwvzzvdjciqmsfeeswemuamnxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015149.605782-2290-280063137749283/AnsiballZ_file.py'
Dec 06 09:59:09 compute-1 sudo[216786]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:59:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:09 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f4004410 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:59:10 compute-1 python3.9[216788]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:59:10 compute-1 sudo[216786]: pam_unix(sudo:session): session closed for user root
Dec 06 09:59:10 compute-1 sudo[216939]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-totygqlkfpnjftjalkwikovsrtsbttct ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015150.2161574-2290-253119962487488/AnsiballZ_file.py'
Dec 06 09:59:10 compute-1 sudo[216939]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:59:10 compute-1 python3.9[216941]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:59:10 compute-1 sudo[216939]: pam_unix(sudo:session): session closed for user root
Dec 06 09:59:10 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:10 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c001470 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:59:11 compute-1 sudo[217091]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lmozjbmcaaoiclljfkqqcisdtxcaexvr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015150.9658635-2290-36010793014546/AnsiballZ_file.py'
Dec 06 09:59:11 compute-1 sudo[217091]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:59:11 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:59:11 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:59:11 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:11.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:59:11 compute-1 python3.9[217093]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:59:11 compute-1 sudo[217091]: pam_unix(sudo:session): session closed for user root
Dec 06 09:59:11 compute-1 ceph-mon[79770]: pgmap v530: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:59:11 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:59:11 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:59:11 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:11.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:59:11 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:11 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0004080 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:59:11 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:11 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0004080 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:59:12 compute-1 sudo[217118]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 09:59:12 compute-1 sudo[217118]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:59:12 compute-1 sudo[217118]: pam_unix(sudo:session): session closed for user root
Dec 06 09:59:12 compute-1 sudo[217269]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xviohdqydygllmnarzjvuywxqzpbpixc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015152.3254185-2461-170196260762639/AnsiballZ_file.py'
Dec 06 09:59:12 compute-1 sudo[217269]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:59:12 compute-1 python3.9[217271]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:59:12 compute-1 sudo[217269]: pam_unix(sudo:session): session closed for user root
Dec 06 09:59:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:12 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f4004410 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:59:13 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:59:13 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 09:59:13 compute-1 ceph-mon[79770]: pgmap v531: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:59:13 compute-1 sudo[217421]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-emxkezvxqfpwtdocpelvnnbmwinjyjid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015153.003485-2461-213300322663082/AnsiballZ_file.py'
Dec 06 09:59:13 compute-1 sudo[217421]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:59:13 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:59:13 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:59:13 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:13.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:59:13 compute-1 python3.9[217423]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:59:13 compute-1 sudo[217421]: pam_unix(sudo:session): session closed for user root
Dec 06 09:59:13 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:59:13 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:59:13 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:59:13 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:13.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:59:13 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:13 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c001470 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:59:13 compute-1 sudo[217573]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-onddzhweoobmsaylenegctmbimlzrocz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015153.556075-2461-270943998791462/AnsiballZ_file.py'
Dec 06 09:59:13 compute-1 sudo[217573]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:59:13 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:13 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb218008be0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:59:14 compute-1 python3.9[217575]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:59:14 compute-1 sudo[217573]: pam_unix(sudo:session): session closed for user root
Dec 06 09:59:14 compute-1 sudo[217726]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lciyzowsihltmjzrlaqiocaekeqjxkdp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015154.1761842-2461-171209146266550/AnsiballZ_file.py'
Dec 06 09:59:14 compute-1 sudo[217726]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:59:14 compute-1 python3.9[217728]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:59:14 compute-1 sudo[217726]: pam_unix(sudo:session): session closed for user root
Dec 06 09:59:14 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:14 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e0004080 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:59:15 compute-1 sudo[217878]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-reqdrudlynwpzcxoxdltbvtfmnoopedo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015154.8508427-2461-137090466119643/AnsiballZ_file.py'
Dec 06 09:59:15 compute-1 sudo[217878]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:59:15 compute-1 ceph-mon[79770]: pgmap v532: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:59:15 compute-1 python3.9[217880]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:59:15 compute-1 sudo[217878]: pam_unix(sudo:session): session closed for user root
Dec 06 09:59:15 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:59:15 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 09:59:15 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:15.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 09:59:15 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:59:15 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:59:15 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:15.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:59:15 compute-1 sudo[218030]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-svbvghtrtwbsfbkkrnqhsjqooujsklij ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015155.5300672-2461-124170644745090/AnsiballZ_file.py'
Dec 06 09:59:15 compute-1 sudo[218030]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:59:15 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:15 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f4004410 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:59:15 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:15 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c001470 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:59:15 compute-1 python3.9[218032]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:59:15 compute-1 sudo[218030]: pam_unix(sudo:session): session closed for user root
Dec 06 09:59:16 compute-1 sudo[218183]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ebxbmzjuxjcvbjdghebdaefdxdhotifj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015156.1119578-2461-87655594221263/AnsiballZ_file.py'
Dec 06 09:59:16 compute-1 sudo[218183]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:59:16 compute-1 python3.9[218185]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:59:16 compute-1 sudo[218183]: pam_unix(sudo:session): session closed for user root
Dec 06 09:59:16 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:16 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb218008be0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:59:16 compute-1 sudo[218335]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-emfbofffymnsfbmvqzuekqbxysexkjzc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015156.6893902-2461-75885836434714/AnsiballZ_file.py'
Dec 06 09:59:16 compute-1 sudo[218335]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:59:17 compute-1 python3.9[218337]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:59:17 compute-1 sudo[218335]: pam_unix(sudo:session): session closed for user root
Dec 06 09:59:17 compute-1 ceph-mon[79770]: pgmap v533: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:59:17 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:59:17 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:59:17 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:17.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:59:17 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:59:17 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:59:17 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:17.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:59:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:17 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e00049a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:59:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:17 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f4004410 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:59:18 compute-1 sudo[218488]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cayevqysrsrcmlshdozuwtzdzbbcqkan ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015158.1923606-2635-102304807703490/AnsiballZ_command.py'
Dec 06 09:59:18 compute-1 sudo[218488]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:59:18 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:59:18 compute-1 python3.9[218490]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:59:18 compute-1 sudo[218488]: pam_unix(sudo:session): session closed for user root
Dec 06 09:59:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:18 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c001470 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:59:19 compute-1 ceph-mon[79770]: pgmap v534: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 09:59:19 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:59:19 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 09:59:19 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:19.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 09:59:19 compute-1 python3.9[218642]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 06 09:59:19 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:59:19 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:59:19 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:19.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:59:19 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:19 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb218008be0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:59:19 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:19 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e00049a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:59:20 compute-1 sudo[218793]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xlosoyfelnaiodksywmtseinhjiuprjw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015160.0604663-2689-138994975407506/AnsiballZ_systemd_service.py'
Dec 06 09:59:20 compute-1 sudo[218793]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:59:20 compute-1 python3.9[218795]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 06 09:59:20 compute-1 systemd[1]: Reloading.
Dec 06 09:59:20 compute-1 systemd-rc-local-generator[218819]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 09:59:20 compute-1 systemd-sysv-generator[218825]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 09:59:20 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:20 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f4004410 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:59:21 compute-1 sudo[218793]: pam_unix(sudo:session): session closed for user root
Dec 06 09:59:21 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:59:21 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:59:21 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:21.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:59:21 compute-1 ceph-mon[79770]: pgmap v535: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:59:21 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:59:21 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:59:21 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:21.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:59:21 compute-1 sudo[218980]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tkvesihgqacfsemdxxhpueiedtuyfcha ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015161.4302478-2713-60516969471781/AnsiballZ_command.py'
Dec 06 09:59:21 compute-1 sudo[218980]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:59:21 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:21 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c001470 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:59:21 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:21 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb218008be0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:59:21 compute-1 python3.9[218982]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:59:21 compute-1 sudo[218980]: pam_unix(sudo:session): session closed for user root
Dec 06 09:59:22 compute-1 sudo[219134]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kspxaaeqydhgfonmajoncoddjvexcfhc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015162.1071556-2713-223535058405931/AnsiballZ_command.py'
Dec 06 09:59:22 compute-1 sudo[219134]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:59:22 compute-1 python3.9[219136]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:59:22 compute-1 sudo[219134]: pam_unix(sudo:session): session closed for user root
Dec 06 09:59:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:22 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e00049a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:59:23 compute-1 sudo[219287]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jkkhmsrqvyykbgysmydgusapokfaiqod ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015162.7713485-2713-51766762162458/AnsiballZ_command.py'
Dec 06 09:59:23 compute-1 sudo[219287]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:59:23 compute-1 python3.9[219289]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:59:23 compute-1 sudo[219287]: pam_unix(sudo:session): session closed for user root
Dec 06 09:59:23 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:59:23 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:59:23 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:23.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:59:23 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:59:23 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:59:23 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:59:23 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:23.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:59:23 compute-1 ceph-mon[79770]: pgmap v536: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:59:23 compute-1 sudo[219440]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hthjpcvsdfnqybagdebmsupaspgjzaih ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015163.3918045-2713-103204243530935/AnsiballZ_command.py'
Dec 06 09:59:23 compute-1 sudo[219440]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:59:23 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:23 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f4004410 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:59:23 compute-1 python3.9[219442]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:59:23 compute-1 sudo[219440]: pam_unix(sudo:session): session closed for user root
Dec 06 09:59:23 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:23 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c001470 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:59:24 compute-1 sudo[219594]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ugqqxxivdwumvjgdnbpkfywegjdhhock ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015164.0227425-2713-265750321390457/AnsiballZ_command.py'
Dec 06 09:59:24 compute-1 sudo[219594]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:59:24 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 09:59:24 compute-1 python3.9[219596]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:59:24 compute-1 sudo[219594]: pam_unix(sudo:session): session closed for user root
Dec 06 09:59:24 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:24 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb218008be0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:59:25 compute-1 sudo[219758]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rrjwqmtcwvirnxfsgqguntlukduftass ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015164.82501-2713-138936320300997/AnsiballZ_command.py'
Dec 06 09:59:25 compute-1 sudo[219758]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:59:25 compute-1 podman[219721]: 2025-12-06 09:59:25.154378875 +0000 UTC m=+0.095846784 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 06 09:59:25 compute-1 python3.9[219766]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:59:25 compute-1 sudo[219758]: pam_unix(sudo:session): session closed for user root
Dec 06 09:59:25 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:59:25 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:59:25 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:25.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:59:25 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:59:25 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:59:25 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:25.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:59:25 compute-1 ceph-mon[79770]: pgmap v537: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:59:25 compute-1 sudo[219927]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hjubinrjwicjboqpjjvsduxfakcbbbch ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015165.4670277-2713-215097174703203/AnsiballZ_command.py'
Dec 06 09:59:25 compute-1 sudo[219927]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:59:25 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:25 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e00049a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:59:25 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:25 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f4004410 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:59:25 compute-1 python3.9[219929]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:59:25 compute-1 sudo[219927]: pam_unix(sudo:session): session closed for user root
Dec 06 09:59:26 compute-1 sudo[220081]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dacruvqdmxgzzbbjiflhouieiwcsgjdn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015166.1227944-2713-279490433678436/AnsiballZ_command.py'
Dec 06 09:59:26 compute-1 sudo[220081]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:59:26 compute-1 python3.9[220083]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 06 09:59:26 compute-1 sudo[220081]: pam_unix(sudo:session): session closed for user root
Dec 06 09:59:26 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:26 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c001470 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:59:27 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:59:27 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:59:27 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:27.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:59:27 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:59:27 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:59:27 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:27.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:59:27 compute-1 ceph-mon[79770]: pgmap v538: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:59:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:27 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb218008be0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:59:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:27 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e00049a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:59:28 compute-1 sudo[220235]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mqbdfrhhfgmzpupetjoizcorkdqgykch ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015168.1640296-2920-191769437056037/AnsiballZ_file.py'
Dec 06 09:59:28 compute-1 sudo[220235]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:59:28 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:59:28 compute-1 python3.9[220237]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:59:28 compute-1 sudo[220235]: pam_unix(sudo:session): session closed for user root
Dec 06 09:59:28 compute-1 ceph-mon[79770]: pgmap v539: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 09:59:28 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:28 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f4004410 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:59:29 compute-1 sudo[220387]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bcbsobbmwlltsdgkacfikldrcnltyldz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015168.8571997-2920-154620586924493/AnsiballZ_file.py'
Dec 06 09:59:29 compute-1 sudo[220387]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:59:29 compute-1 python3.9[220389]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:59:29 compute-1 sudo[220387]: pam_unix(sudo:session): session closed for user root
Dec 06 09:59:29 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:59:29 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:59:29 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:29.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:59:29 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:59:29 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:59:29 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:29.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:59:29 compute-1 sudo[220489]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 09:59:29 compute-1 sudo[220489]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:59:29 compute-1 sudo[220489]: pam_unix(sudo:session): session closed for user root
Dec 06 09:59:29 compute-1 sudo[220579]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-blgatbapvbhikboqwnmtvdvqdslczfyd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015169.4738352-2920-193914098434286/AnsiballZ_file.py'
Dec 06 09:59:29 compute-1 podman[220536]: 2025-12-06 09:59:29.748441219 +0000 UTC m=+0.053627308 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 06 09:59:29 compute-1 sudo[220579]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:59:29 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:29 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f4004410 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:59:29 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:29 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb218008be0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:59:29 compute-1 python3.9[220583]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:59:29 compute-1 sudo[220579]: pam_unix(sudo:session): session closed for user root
Dec 06 09:59:30 compute-1 sudo[220734]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-seyadjahduwyudwutvqiexznnjbfjwck ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015170.3875885-2986-206656120873364/AnsiballZ_file.py'
Dec 06 09:59:30 compute-1 sudo[220734]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:59:30 compute-1 python3.9[220736]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:59:30 compute-1 sudo[220734]: pam_unix(sudo:session): session closed for user root
Dec 06 09:59:30 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:30 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e00049a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:59:31 compute-1 sudo[220887]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-anoukkuppoyfkjspyubggioihleehbiw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015171.014529-2986-211687490441023/AnsiballZ_file.py'
Dec 06 09:59:31 compute-1 sudo[220887]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:59:31 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:59:31 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:59:31 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:31.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:59:31 compute-1 ceph-mon[79770]: pgmap v540: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:59:31 compute-1 python3.9[220889]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:59:31 compute-1 sudo[220887]: pam_unix(sudo:session): session closed for user root
Dec 06 09:59:31 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:59:31 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:59:31 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:31.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:59:31 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:31 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb20c001470 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:59:31 compute-1 sudo[221039]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mppcghqqydkwkxociqdulvihptsqgngz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015171.6301334-2986-185239895620786/AnsiballZ_file.py'
Dec 06 09:59:31 compute-1 sudo[221039]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:59:31 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:31 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb214002ff0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:59:32 compute-1 python3.9[221041]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:59:32 compute-1 sudo[221039]: pam_unix(sudo:session): session closed for user root
Dec 06 09:59:32 compute-1 sudo[221193]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qtgwjkwmzkqrtobfvwsmxhfglzeypnzz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015172.2587123-2986-127819183416573/AnsiballZ_file.py'
Dec 06 09:59:32 compute-1 sudo[221193]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:59:32 compute-1 python3.9[221195]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:59:32 compute-1 sudo[221193]: pam_unix(sudo:session): session closed for user root
Dec 06 09:59:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:32 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb218008be0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:59:33 compute-1 sudo[221345]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hepliwblzlmpncwbvxzvsciigxnnxmix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015173.082411-2986-241166402477852/AnsiballZ_file.py'
Dec 06 09:59:33 compute-1 sudo[221345]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:59:33 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:59:33 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:59:33 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:33.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:59:33 compute-1 ceph-mon[79770]: pgmap v541: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:59:33 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:59:33 compute-1 python3.9[221347]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:59:33 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:59:33 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:59:33 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:33.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:59:33 compute-1 sudo[221345]: pam_unix(sudo:session): session closed for user root
Dec 06 09:59:33 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:33 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0002f10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:59:33 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:33 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e00049a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:59:34 compute-1 sudo[221497]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jbqwsfliikeadpqxnnbhxejjutnxxywc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015173.7954216-2986-16090245996560/AnsiballZ_file.py'
Dec 06 09:59:34 compute-1 sudo[221497]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:59:34 compute-1 python3.9[221499]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:59:34 compute-1 sudo[221497]: pam_unix(sudo:session): session closed for user root
Dec 06 09:59:34 compute-1 podman[221501]: 2025-12-06 09:59:34.34917164 +0000 UTC m=+0.059262868 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Dec 06 09:59:34 compute-1 sudo[221670]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zwraseouzqfdawvmqentrhawpmlhlket ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015174.4395688-2986-175846386798181/AnsiballZ_file.py'
Dec 06 09:59:34 compute-1 sudo[221670]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:59:34 compute-1 python3.9[221672]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:59:34 compute-1 sudo[221670]: pam_unix(sudo:session): session closed for user root
Dec 06 09:59:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:34 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb214002ff0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:59:35 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:59:35 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:59:35 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:35.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:59:35 compute-1 ceph-mon[79770]: pgmap v542: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:59:35 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:59:35 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:59:35 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:35.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:59:35 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:35 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb218008be0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:59:35 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:35 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0002f10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:59:35 compute-1 ceph-osd[77465]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 09:59:35 compute-1 ceph-osd[77465]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Cumulative writes: 7804 writes, 31K keys, 7804 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.02 MB/s
                                           Cumulative WAL: 7804 writes, 1639 syncs, 4.76 writes per sync, written: 0.02 GB, 0.02 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 625 writes, 1051 keys, 625 commit groups, 1.0 writes per commit group, ingest: 0.41 MB, 0.00 MB/s
                                           Interval WAL: 625 writes, 306 syncs, 2.04 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.013       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.013       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.01              0.00         1    0.013       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fb227cf350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fb227cf350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fb227cf350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fb227cf350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fb227cf350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fb227cf350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fb227cf350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fb227ce9b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 6e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fb227ce9b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 6e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fb227ce9b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 6e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fb227cf350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fb227cf350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Dec 06 09:59:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:36 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e00049a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:59:37 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:59:37 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:59:37 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:37.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:59:37 compute-1 ceph-mon[79770]: pgmap v543: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:59:37 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:59:37 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:59:37 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:37.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:59:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:37 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb214002ff0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:59:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:37 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb218008be0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:59:38 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:59:38 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:38 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0002f10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:59:39 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:59:39 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:59:39 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:39.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:59:39 compute-1 ceph-mon[79770]: pgmap v544: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
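
Annotation: the recurring ceph-mon pgmap lines summarize placement-group state, capacity, and client throughput. A sketch splitting one such line into fields (layout assumed from the samples in this journal):

import re

line = ("pgmap v544: 337 pgs: 337 active+clean; 458 KiB data, "
        "153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s")

m = re.match(r"pgmap v(\d+): (\d+) pgs: (\d+) ([\w+]+); "
             r"(.+ data), (.+ used), (.+ avail)", line)
if m:
    version, total, in_state, state, data, used, avail = m.groups()
    print(f"pgmap v{version}: {in_state}/{total} PGs {state}; {data}; {used}; {avail}")
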
Dec 06 09:59:39 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 09:59:39 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:59:39 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:59:39 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:39.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:59:39 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:39 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e00049a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:59:39 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:39 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb214002ff0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:59:40 compute-1 sudo[221825]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wunccvqhngbtyxpmaqckiwtrlgzblawg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015180.3607776-3311-26534536475633/AnsiballZ_getent.py'
Dec 06 09:59:40 compute-1 sudo[221825]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:59:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:40 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb218008be0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:59:40 compute-1 python3.9[221827]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Dec 06 09:59:41 compute-1 sudo[221825]: pam_unix(sudo:session): session closed for user root
Dec 06 09:59:41 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:59:41 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:59:41 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:41.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:59:41 compute-1 ceph-mon[79770]: pgmap v545: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:59:41 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:59:41 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 09:59:41 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:41.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 09:59:41 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:41 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0002f10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:59:41 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:41 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1e00049a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:59:41 compute-1 sudo[221978]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-swqkefqhdtfofadtopsxzaisltdubfdu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015181.449419-3335-120598429553463/AnsiballZ_group.py'
Dec 06 09:59:41 compute-1 sudo[221978]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:59:42 compute-1 python3.9[221980]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 06 09:59:42 compute-1 groupadd[221981]: group added to /etc/group: name=nova, GID=42436
Dec 06 09:59:42 compute-1 groupadd[221981]: group added to /etc/gshadow: name=nova
Dec 06 09:59:42 compute-1 groupadd[221981]: new group: name=nova, GID=42436
Dec 06 09:59:42 compute-1 sudo[221978]: pam_unix(sudo:session): session closed for user root
Dec 06 09:59:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:42 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb214002ff0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:59:43 compute-1 sudo[222137]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wegzqtrwauoejrzmhsrswwlkesgngbor ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015182.5379677-3359-257503160856093/AnsiballZ_user.py'
Dec 06 09:59:43 compute-1 sudo[222137]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:59:43 compute-1 python3.9[222139]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
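
Annotation: the ansible.builtin.user invocation above produces the useradd call logged next. Roughly equivalent to the following sketch (requires root; values copied from the log; Ansible does more bookkeeping than a bare useradd, so this is illustration only):

import subprocess

subprocess.run(
    ["useradd",
     "--uid", "42436",        # uid=42436 from the module arguments
     "--gid", "nova",         # primary group created just above (GID 42436)
     "--groups", "libvirt",   # supplementary group from groups=['libvirt']
     "--shell", "/bin/sh",
     "--comment", "nova user",
     "--create-home",
     "nova"],
    check=True,
)
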
Dec 06 09:59:43 compute-1 useradd[222141]: new user: name=nova, UID=42436, GID=42436, home=/home/nova, shell=/bin/sh, from=/dev/pts/0
Dec 06 09:59:43 compute-1 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 06 09:59:43 compute-1 useradd[222141]: add 'nova' to group 'libvirt'
Dec 06 09:59:43 compute-1 useradd[222141]: add 'nova' to shadow group 'libvirt'
Dec 06 09:59:43 compute-1 sudo[222137]: pam_unix(sudo:session): session closed for user root
Dec 06 09:59:43 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:59:43 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 09:59:43 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:43.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 09:59:43 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:59:43 compute-1 ceph-mon[79770]: pgmap v546: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:59:43 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:59:43 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:59:43 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:43.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:59:43 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:43 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb218008be0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 09:59:43 compute-1 kernel: ganesha.nfsd[221042]: segfault at 50 ip 00007fb2c8a7832e sp 00007fb28d7f9210 error 4 in libntirpc.so.5.8[7fb2c8a5d000+2c000] likely on CPU 7 (core 0, socket 7)
Dec 06 09:59:43 compute-1 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
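
Annotation: the kernel segfault line gives the faulting instruction pointer and the library mapping ("libntirpc.so.5.8[7fb2c8a5d000+2c000]"); subtracting the base yields a mapping-relative offset that symbolization tools can resolve. The arithmetic, with values copied from the line above:

# Turn the kernel segfault line into a library-relative offset.
ip = 0x7fb2c8a7832e     # faulting instruction pointer from "ip ..."
base = 0x7fb2c8a5d000   # start of the libntirpc.so.5.8 text mapping

offset = ip - base
print(hex(offset))      # 0x1b32e

Note that the systemd-coredump stack trace further down reports + 0x2232e against a different base (plausibly the whole object mapping rather than just the executable segment), so the two offsets need not agree.
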
Dec 06 09:59:43 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[166425]: 06/12/2025 09:59:43 : epoch 6933fd56 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb1f0002f10 fd 39 proxy ignored for local
Dec 06 09:59:43 compute-1 systemd[1]: Started Process Core Dump (PID 222173/UID 0).
Dec 06 09:59:44 compute-1 sshd-session[222176]: Accepted publickey for zuul from 192.168.122.30 port 36846 ssh2: ECDSA SHA256:r1j7aLsKAM+XxDNbzEU5vWGpGNCOaIBwc7FZdATPttA
Dec 06 09:59:44 compute-1 systemd-logind[788]: New session 54 of user zuul.
Dec 06 09:59:44 compute-1 systemd[1]: Started Session 54 of User zuul.
Dec 06 09:59:44 compute-1 sshd-session[222176]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 06 09:59:44 compute-1 sshd-session[222179]: Received disconnect from 192.168.122.30 port 36846:11: disconnected by user
Dec 06 09:59:44 compute-1 sshd-session[222179]: Disconnected from user zuul 192.168.122.30 port 36846
Dec 06 09:59:44 compute-1 sshd-session[222176]: pam_unix(sshd:session): session closed for user zuul
Dec 06 09:59:44 compute-1 systemd-logind[788]: Session 54 logged out. Waiting for processes to exit.
Dec 06 09:59:44 compute-1 systemd[1]: session-54.scope: Deactivated successfully.
Dec 06 09:59:44 compute-1 systemd-logind[788]: Removed session 54.
Dec 06 09:59:45 compute-1 python3.9[222329]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:59:45 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:59:45 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 09:59:45 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:45.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 09:59:45 compute-1 ceph-mon[79770]: pgmap v547: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:59:45 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:59:45 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:59:45 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:45.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:59:45 compute-1 systemd-coredump[222174]: Process 166453 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 77:
                                                    #0  0x00007fb2c8a7832e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    ELF object binary architecture: AMD x86-64
Dec 06 09:59:45 compute-1 systemd[1]: systemd-coredump@6-222173-0.service: Deactivated successfully.
Dec 06 09:59:45 compute-1 systemd[1]: systemd-coredump@6-222173-0.service: Consumed 1.800s CPU time.
Dec 06 09:59:45 compute-1 podman[222412]: 2025-12-06 09:59:45.946222368 +0000 UTC m=+0.040767540 container died 59c3a18112ee7376f7e084c537acf33fec4744253b3178b4083465a9740dedf8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, ceph=True, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0)
Dec 06 09:59:45 compute-1 systemd[1]: var-lib-containers-storage-overlay-674595a2f8d871ddef4522155fda703c933fe31e7b86dbc4d96e00021066cf79-merged.mount: Deactivated successfully.
Dec 06 09:59:46 compute-1 podman[222412]: 2025-12-06 09:59:46.006044429 +0000 UTC m=+0.100589581 container remove 59c3a18112ee7376f7e084c537acf33fec4744253b3178b4083465a9740dedf8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, ceph=True, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 06 09:59:46 compute-1 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Main process exited, code=exited, status=139/n/a
Dec 06 09:59:46 compute-1 python3.9[222467]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765015184.784028-3434-215211136421978/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:59:46 compute-1 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Failed with result 'exit-code'.
Dec 06 09:59:46 compute-1 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Consumed 2.578s CPU time.
Dec 06 09:59:46 compute-1 python3.9[222646]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:59:47 compute-1 python3.9[222722]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:59:47 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:59:47 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:59:47 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:47.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:59:47 compute-1 ceph-mon[79770]: pgmap v548: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 09:59:47 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:59:47 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:59:47 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:47.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:59:48 compute-1 python3.9[222872]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:59:48 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:59:48 compute-1 python3.9[222994]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765015187.5572815-3434-81769474876170/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:59:49 compute-1 python3.9[223144]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:59:49 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:59:49 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:59:49 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:49.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:59:49 compute-1 ceph-mon[79770]: pgmap v549: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Dec 06 09:59:49 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:59:49 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:59:49 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:49.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:59:49 compute-1 sudo[223266]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 09:59:49 compute-1 sudo[223266]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 09:59:49 compute-1 sudo[223266]: pam_unix(sudo:session): session closed for user root
Dec 06 09:59:49 compute-1 python3.9[223265]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765015188.880816-3434-61046852501847/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=bc7f3bb7d4094c596a18178a888511b54e157ba4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:59:49 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/095949 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
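
Annotation: haproxy marks backend/nfs.cephfs.0 DOWN after a Layer4 "Connection refused", consistent with the ganesha.nfsd segfault above. An equivalent TCP probe, as a sketch; HOST and PORT are assumptions for illustration (the log does not show the backend address, and 2049 is simply the conventional NFS port):

import socket

HOST, PORT = "192.168.122.102", 2049

try:
    # Layer4-style health probe, mirroring what haproxy reports above.
    with socket.create_connection((HOST, PORT), timeout=1):
        print("UP: TCP connect succeeded")
except OSError as exc:
    print(f"DOWN: {exc}")  # e.g. ConnectionRefusedError while ganesha is down
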
Dec 06 09:59:50 compute-1 python3.9[223441]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:59:50 compute-1 python3.9[223562]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765015190.0367112-3434-130095729765037/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:59:51 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:59:51 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:59:51 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:51.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:59:51 compute-1 python3.9[223712]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:59:51 compute-1 ceph-mon[79770]: pgmap v550: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 06 09:59:51 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:59:51 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:59:51 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:51.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:59:52 compute-1 python3.9[223833]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765015191.130659-3434-40697051335534/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:59:52 compute-1 sudo[223984]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qkxjvduexwiegdujdwckjbxgmetskarf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015192.5787766-3683-263632660488831/AnsiballZ_file.py'
Dec 06 09:59:52 compute-1 sudo[223984]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:59:53 compute-1 python3.9[223986]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:59:53 compute-1 sudo[223984]: pam_unix(sudo:session): session closed for user root
Dec 06 09:59:53 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:59:53 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:59:53 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:53.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:59:53 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:59:53 compute-1 ceph-mon[79770]: pgmap v551: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 06 09:59:53 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:59:53 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:59:53 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:53.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:59:53 compute-1 sudo[224136]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twzfjkwoxzjzlaneqwttgaptsepnmsfh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015193.4020598-3707-4816206226934/AnsiballZ_copy.py'
Dec 06 09:59:53 compute-1 sudo[224136]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:59:53 compute-1 python3.9[224138]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 09:59:53 compute-1 sudo[224136]: pam_unix(sudo:session): session closed for user root
Dec 06 09:59:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:59:54.270 141446 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 09:59:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:59:54.272 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 09:59:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 09:59:54.272 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 09:59:54 compute-1 sudo[224289]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oysabjbmczcgwggwnvodzskceiwncvjq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015194.1690361-3731-186060435039358/AnsiballZ_stat.py'
Dec 06 09:59:54 compute-1 sudo[224289]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:59:54 compute-1 python3.9[224291]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:59:54 compute-1 sudo[224289]: pam_unix(sudo:session): session closed for user root
Dec 06 09:59:54 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 09:59:55 compute-1 sudo[224441]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zbetffvxbhusaggxehfnqnbsthhqiuxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015194.9025126-3756-143507184368725/AnsiballZ_stat.py'
Dec 06 09:59:55 compute-1 sudo[224441]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:59:55 compute-1 podman[224443]: 2025-12-06 09:59:55.37023775 +0000 UTC m=+0.138189412 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 09:59:55 compute-1 python3.9[224444]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:59:55 compute-1 sudo[224441]: pam_unix(sudo:session): session closed for user root
Dec 06 09:59:55 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:59:55 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:59:55 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:55.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:59:55 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:59:55 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:59:55 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:55.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:59:55 compute-1 ceph-mon[79770]: pgmap v552: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 06 09:59:55 compute-1 sudo[224590]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-omssnncyegjjrcpfxqmqqfkzcxyuuhns ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015194.9025126-3756-143507184368725/AnsiballZ_copy.py'
Dec 06 09:59:55 compute-1 sudo[224590]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 09:59:55 compute-1 python3.9[224592]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1765015194.9025126-3756-143507184368725/.source _original_basename=.u2lyrnjp follow=False checksum=1b13389afdbc18c3b0e4207972a5a874c4fd04bf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Dec 06 09:59:55 compute-1 sudo[224590]: pam_unix(sudo:session): session closed for user root
Dec 06 09:59:56 compute-1 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Scheduled restart job, restart counter is at 7.
Dec 06 09:59:56 compute-1 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.djsnbu for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec 06 09:59:56 compute-1 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Consumed 2.578s CPU time.
Dec 06 09:59:56 compute-1 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.djsnbu for 5ecd3f74-dade-5fc4-92ce-8950ae424258...
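
Annotation: systemd tracks how often this ganesha unit has been restarted ("restart counter is at 7"). The same value can be read back through the NRestarts unit property, as in this sketch (unit name copied from the journal lines above):

import subprocess

unit = ("ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258"
        "@nfs.cephfs.0.0.compute-1.djsnbu.service")

out = subprocess.run(
    ["systemctl", "show", "-p", "NRestarts", unit],
    capture_output=True, text=True, check=True,
).stdout.strip()
print(out)  # e.g. "NRestarts=7"
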
Dec 06 09:59:56 compute-1 podman[224711]: 2025-12-06 09:59:56.660122724 +0000 UTC m=+0.046867562 container create 6dc139c09dbc99a313d5333e87cc0ba0df15ffda5b12614866d45ea226e1d6ce (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, ceph=True, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec 06 09:59:56 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8c79bfeb25e587d3943a06906c158dd3f62a52f59079e06c39c4ba774c28c036/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec 06 09:59:56 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8c79bfeb25e587d3943a06906c158dd3f62a52f59079e06c39c4ba774c28c036/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 06 09:59:56 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8c79bfeb25e587d3943a06906c158dd3f62a52f59079e06c39c4ba774c28c036/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec 06 09:59:56 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8c79bfeb25e587d3943a06906c158dd3f62a52f59079e06c39c4ba774c28c036/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.djsnbu-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec 06 09:59:56 compute-1 podman[224711]: 2025-12-06 09:59:56.732124976 +0000 UTC m=+0.118869834 container init 6dc139c09dbc99a313d5333e87cc0ba0df15ffda5b12614866d45ea226e1d6ce (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, ceph=True, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0)
Dec 06 09:59:56 compute-1 podman[224711]: 2025-12-06 09:59:56.640845036 +0000 UTC m=+0.027589904 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 06 09:59:56 compute-1 podman[224711]: 2025-12-06 09:59:56.737847238 +0000 UTC m=+0.124592076 container start 6dc139c09dbc99a313d5333e87cc0ba0df15ffda5b12614866d45ea226e1d6ce (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, org.label-schema.vendor=CentOS, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 06 09:59:56 compute-1 bash[224711]: 6dc139c09dbc99a313d5333e87cc0ba0df15ffda5b12614866d45ea226e1d6ce
Dec 06 09:59:56 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 09:59:56 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec 06 09:59:56 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 09:59:56 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec 06 09:59:56 compute-1 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.djsnbu for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec 06 09:59:56 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 09:59:56 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec 06 09:59:56 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 09:59:56 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec 06 09:59:56 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 09:59:56 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec 06 09:59:56 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 09:59:56 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec 06 09:59:56 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 09:59:56 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec 06 09:59:56 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 09:59:56 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 09:59:56 compute-1 python3.9[224826]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 09:59:57 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:59:57 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:59:57 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:57.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:59:57 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:59:57 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:59:57 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:57.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:59:57 compute-1 ceph-mon[79770]: pgmap v553: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 06 09:59:57 compute-1 python3.9[225000]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:59:58 compute-1 python3.9[225121]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765015197.3208287-3833-205626550502035/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=81f1f28d070b2613355f782b83a5777fdba9540e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 09:59:58 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 09:59:59 compute-1 python3.9[225272]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 06 09:59:59 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:59:59 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 09:59:59 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:09:59:59.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 09:59:59 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 09:59:59 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 09:59:59 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:09:59:59.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 09:59:59 compute-1 ceph-mon[79770]: pgmap v554: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 682 B/s rd, 85 B/s wr, 0 op/s
Dec 06 09:59:59 compute-1 python3.9[225393]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765015198.7398415-3878-200401887995763/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=2efe6ae78bce1c26d2c384be079fa366810076ad backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 06 10:00:00 compute-1 ceph-mon[79770]: overall HEALTH_OK
Dec 06 10:00:00 compute-1 sudo[225558]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-izllzkezfblskobonochazyphpzrbmiv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015200.404695-3929-249904034850503/AnsiballZ_container_config_data.py'
Dec 06 10:00:00 compute-1 sudo[225558]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 10:00:00 compute-1 podman[225518]: 2025-12-06 10:00:00.770509755 +0000 UTC m=+0.086847142 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 06 10:00:00 compute-1 python3.9[225562]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Dec 06 10:00:00 compute-1 sudo[225558]: pam_unix(sudo:session): session closed for user root
Dec 06 10:00:01 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:00:01 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:00:01 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:00:01.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:00:01 compute-1 sudo[225715]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ayriwbleljaaojhtaxmopcvqrobwpwjn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015201.2727444-3956-166853081489069/AnsiballZ_container_config_hash.py'
Dec 06 10:00:01 compute-1 sudo[225715]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 10:00:01 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:00:01 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:00:01 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:00:01.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:00:01 compute-1 python3.9[225717]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 06 10:00:01 compute-1 ceph-mon[79770]: pgmap v555: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 85 B/s wr, 0 op/s
Dec 06 10:00:01 compute-1 sudo[225715]: pam_unix(sudo:session): session closed for user root
Dec 06 10:00:02 compute-1 sudo[225868]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vpeyewtrpauvdqzktjseepcsdtujcxjh ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765015202.2247694-3986-130748322748961/AnsiballZ_edpm_container_manage.py'
Dec 06 10:00:02 compute-1 sudo[225868]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 10:00:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:02 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:00:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:02 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:00:02 compute-1 python3[225870]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False
Dec 06 10:00:03 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:00:03 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:00:03 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:00:03.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:00:03 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:00:03 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:00:03 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:00:03 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:00:03.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:00:03 compute-1 ceph-mon[79770]: pgmap v556: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 85 B/s wr, 0 op/s
Dec 06 10:00:04 compute-1 podman[225908]: 2025-12-06 10:00:04.764858913 +0000 UTC m=+0.067270727 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Dec 06 10:00:05 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:00:05 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:00:05 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:00:05.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:00:05 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:00:05 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:00:05 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:00:05.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:00:05 compute-1 ceph-mon[79770]: pgmap v557: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 938 B/s wr, 3 op/s
Dec 06 10:00:07 compute-1 ceph-mon[79770]: pgmap v558: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 938 B/s wr, 3 op/s
Dec 06 10:00:07 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:00:07 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:00:07 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:00:07.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:00:07 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:00:07 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:00:07 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:00:07.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:00:08 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:00:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:09 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec 06 10:00:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:09 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec 06 10:00:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:09 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec 06 10:00:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:09 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec 06 10:00:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:09 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec 06 10:00:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:09 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec 06 10:00:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:09 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec 06 10:00:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:09 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 06 10:00:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:09 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 06 10:00:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:09 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 06 10:00:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:09 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec 06 10:00:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:09 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 06 10:00:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:09 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec 06 10:00:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:09 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec 06 10:00:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:09 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec 06 10:00:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:09 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec 06 10:00:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:09 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec 06 10:00:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:09 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec 06 10:00:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:09 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec 06 10:00:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:09 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec 06 10:00:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:09 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec 06 10:00:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:09 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec 06 10:00:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:09 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec 06 10:00:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:09 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec 06 10:00:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:09 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec 06 10:00:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:09 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec 06 10:00:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:09 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec 06 10:00:09 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:00:09 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:00:09 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:00:09.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:00:09 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:00:09 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:00:09 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:00:09.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:00:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:09 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4e0000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:00:09 compute-1 sudo[225986]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:00:09 compute-1 sudo[225986]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:00:09 compute-1 sudo[225986]: pam_unix(sudo:session): session closed for user root
Dec 06 10:00:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:09 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4d4001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:00:10 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:10 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4bc000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:00:11 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:00:11 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:00:11 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:00:11.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:00:11 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:00:11 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:00:11 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:00:11.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:00:11 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:11 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4bc000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:00:11 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/100011 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec 06 10:00:11 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:11 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4b8000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:00:12 compute-1 sudo[226018]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:00:12 compute-1 sudo[226018]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:00:12 compute-1 sudo[226018]: pam_unix(sudo:session): session closed for user root
Dec 06 10:00:12 compute-1 sudo[226043]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 06 10:00:12 compute-1 sudo[226043]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:00:12 compute-1 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 10:00:12 compute-1 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Cumulative writes: 3608 writes, 20K keys, 3608 commit groups, 1.0 writes per commit group, ingest: 0.05 GB, 0.05 MB/s
                                           Cumulative WAL: 3607 writes, 3607 syncs, 1.00 writes per sync, written: 0.05 GB, 0.05 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 1396 writes, 6521 keys, 1396 commit groups, 1.0 writes per commit group, ingest: 16.14 MB, 0.03 MB/s
                                           Interval WAL: 1395 writes, 1395 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0    111.7      0.26              0.12         9    0.029       0      0       0.0       0.0
                                             L6      1/0   13.34 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   3.6    115.1    100.1      1.04              0.39         8    0.130     38K   4138       0.0       0.0
                                            Sum      1/0   13.34 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   4.6     92.1    102.4      1.30              0.51        17    0.076     38K   4138       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   6.6    112.5    112.4      0.40              0.14         6    0.067     16K   1857       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Low      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   0.0    115.1    100.1      1.04              0.39         8    0.130     38K   4138       0.0       0.0
                                           High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0    112.5      0.26              0.12         8    0.032       0      0       0.0       0.0
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.028, interval 0.007
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.13 GB write, 0.11 MB/s write, 0.12 GB read, 0.10 MB/s read, 1.3 seconds
                                           Interval compaction: 0.04 GB write, 0.08 MB/s write, 0.04 GB read, 0.08 MB/s read, 0.4 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fbbecff350#2 capacity: 304.00 MB usage: 5.00 MB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 0 last_secs: 0.000146 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(268,4.67 MB,1.53495%) FilterBlock(17,116.36 KB,0.037379%) IndexBlock(17,221.86 KB,0.0712696%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
Dec 06 10:00:13 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:12 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4b8000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:00:13 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:00:13 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:00:13 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:00:13.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:00:13 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:00:13 compute-1 podman[225883]: 2025-12-06 10:00:13.579071167 +0000 UTC m=+10.636459758 image pull 5571c1b2140c835f70406e4553b3b44135b9c9b4eb673345cbd571460c5d59a3 quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5
Dec 06 10:00:13 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:00:13 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:00:13 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:00:13.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:00:13 compute-1 podman[226111]: 2025-12-06 10:00:13.750799868 +0000 UTC m=+0.052502231 container create 5870135d94bf035757c1e7c1605be69377e8a248cd872b4fc5e7fa9268e794c2 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5, name=nova_compute_init, config_id=edpm, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, container_name=nova_compute_init, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:00:13 compute-1 podman[226111]: 2025-12-06 10:00:13.723337738 +0000 UTC m=+0.025040121 image pull 5571c1b2140c835f70406e4553b3b44135b9c9b4eb673345cbd571460c5d59a3 quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5
Dec 06 10:00:13 compute-1 python3[225870]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5 bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Dec 06 10:00:13 compute-1 sudo[226043]: pam_unix(sudo:session): session closed for user root
Dec 06 10:00:13 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:13 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4c4001140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:00:13 compute-1 sudo[225868]: pam_unix(sudo:session): session closed for user root
Dec 06 10:00:13 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:13 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4bc001b40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:00:15 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:15 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4d4001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:00:15 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:00:15 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:00:15 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:00:15.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:00:15 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:00:15 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:00:15 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:00:15.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:00:15 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:15 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4b8001b40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:00:15 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:15 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4c4001c60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:00:16 compute-1 ceph-mds[84241]: mds.beacon.cephfs.compute-1.fpvjgb missed beacon ack from the monitors
Dec 06 10:00:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:17 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4bc001b40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:00:17 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:00:17 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:00:17 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:00:17.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:00:17 compute-1 ceph-mon[79770]: pgmap v559: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.4 KiB/s rd, 1023 B/s wr, 3 op/s
Dec 06 10:00:17 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:00:17 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:00:17 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:00:17 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:00:17.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:00:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:17 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4d4001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:00:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:17 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4b8001b40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:00:18 compute-1 sudo[226313]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mugzzndcjwcaonyfwykiavcbqmtoxwel ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015218.185917-4010-15619558002841/AnsiballZ_stat.py'
Dec 06 10:00:18 compute-1 sudo[226313]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 10:00:18 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:00:18 compute-1 ceph-mon[79770]: pgmap v560: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.9 KiB/s rd, 938 B/s wr, 2 op/s
Dec 06 10:00:18 compute-1 ceph-mon[79770]: pgmap v561: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.9 KiB/s rd, 938 B/s wr, 2 op/s
Dec 06 10:00:18 compute-1 ceph-mon[79770]: pgmap v562: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.0 KiB/s rd, 938 B/s wr, 3 op/s
Dec 06 10:00:18 compute-1 ceph-mon[79770]: pgmap v563: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Dec 06 10:00:18 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:00:18 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:00:18 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 10:00:18 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 06 10:00:18 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:00:18 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:00:18 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 06 10:00:18 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 06 10:00:18 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 10:00:18 compute-1 python3.9[226315]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 10:00:18 compute-1 sudo[226313]: pam_unix(sudo:session): session closed for user root
Dec 06 10:00:19 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:19 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4c4001c60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:00:19 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:00:19 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:00:19 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:00:19.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:00:19 compute-1 sudo[226467]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rpevcfuucvjonnhwfpfaqidtcttfanrf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015219.3157873-4047-42163676156589/AnsiballZ_container_config_data.py'
Dec 06 10:00:19 compute-1 sudo[226467]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 10:00:19 compute-1 ceph-mon[79770]: pgmap v564: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 85 B/s wr, 0 op/s
Dec 06 10:00:19 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:00:19 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:00:19 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:00:19.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:00:19 compute-1 python3.9[226469]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Dec 06 10:00:19 compute-1 sudo[226467]: pam_unix(sudo:session): session closed for user root
Dec 06 10:00:19 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:19 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4bc001b40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:00:19 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:19 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4d4001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:00:20 compute-1 sudo[226620]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhxlmcitwppcnfwqfzzustgbfkfmsncb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015220.0778534-4073-262326604304449/AnsiballZ_container_config_hash.py'
Dec 06 10:00:20 compute-1 sudo[226620]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 10:00:20 compute-1 python3.9[226622]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 06 10:00:20 compute-1 sudo[226620]: pam_unix(sudo:session): session closed for user root
Dec 06 10:00:21 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:21 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4b8001b40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:00:21 compute-1 ceph-mon[79770]: pgmap v565: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Dec 06 10:00:21 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:00:21 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:00:21 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:00:21.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:00:21 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:00:21 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:00:21 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:00:21.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:00:21 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:21 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4c4001c60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:00:21 compute-1 sudo[226772]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hwlimqndmnmjsymleyylybbmlacvbydg ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765015221.689253-4103-172283726242790/AnsiballZ_edpm_container_manage.py'
Dec 06 10:00:21 compute-1 sudo[226772]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 10:00:21 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:21 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4bc002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:00:22 compute-1 python3[226774]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False
Dec 06 10:00:22 compute-1 podman[226809]: 2025-12-06 10:00:22.395790383 +0000 UTC m=+0.051677981 container create fb9b5b61d2ca2446a0d64f5cf1f50538f7d4513378afdaeab81ced8fe95510ba (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5, name=nova_compute, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Dec 06 10:00:22 compute-1 podman[226809]: 2025-12-06 10:00:22.368652151 +0000 UTC m=+0.024539769 image pull 5571c1b2140c835f70406e4553b3b44135b9c9b4eb673345cbd571460c5d59a3 quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5
Dec 06 10:00:22 compute-1 python3[226774]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5 kolla_start
Dec 06 10:00:22 compute-1 sudo[226772]: pam_unix(sudo:session): session closed for user root
Dec 06 10:00:23 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:23 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4d4001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:00:23 compute-1 sudo[226995]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-srcytxbnliomecqldizstyoxzbwhernm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015222.7890723-4127-183733978912087/AnsiballZ_stat.py'
Dec 06 10:00:23 compute-1 sudo[226995]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 10:00:23 compute-1 python3.9[226997]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 10:00:23 compute-1 sudo[226995]: pam_unix(sudo:session): session closed for user root
Dec 06 10:00:23 compute-1 ceph-mon[79770]: pgmap v566: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Dec 06 10:00:23 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:00:23 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:00:23 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:00:23.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:00:23 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:00:23 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:00:23 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:00:23 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:00:23.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:00:23 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:23 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4b8002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:00:23 compute-1 sudo[227149]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hxsccipzdwbmicusrecyxpmxpizyxzib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015223.6474938-4154-276687218243430/AnsiballZ_file.py'
Dec 06 10:00:23 compute-1 sudo[227149]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 10:00:23 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:23 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4c40030f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:00:24 compute-1 python3.9[227151]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 10:00:24 compute-1 sudo[227152]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:00:24 compute-1 sudo[227152]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:00:24 compute-1 sudo[227152]: pam_unix(sudo:session): session closed for user root
Dec 06 10:00:24 compute-1 sudo[227149]: pam_unix(sudo:session): session closed for user root
Dec 06 10:00:24 compute-1 sudo[227326]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-caawubnolwzclpybsnyifulhffhwazka ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015224.1947293-4154-108310403032688/AnsiballZ_copy.py'
Dec 06 10:00:24 compute-1 sudo[227326]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 10:00:24 compute-1 python3.9[227328]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765015224.1947293-4154-108310403032688/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 06 10:00:24 compute-1 sudo[227326]: pam_unix(sudo:session): session closed for user root
Dec 06 10:00:24 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:00:24 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:00:24 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:00:24 compute-1 ceph-mon[79770]: pgmap v567: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 B/s wr, 0 op/s
Dec 06 10:00:24 compute-1 sudo[227402]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eptqypjtlxyimzekjlsqtmlxjdvlbark ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015224.1947293-4154-108310403032688/AnsiballZ_systemd.py'
Dec 06 10:00:24 compute-1 sudo[227402]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 10:00:25 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:25 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4bc002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:00:25 compute-1 python3.9[227404]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 06 10:00:25 compute-1 systemd[1]: Reloading.
Dec 06 10:00:25 compute-1 systemd-rc-local-generator[227427]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 10:00:25 compute-1 systemd-sysv-generator[227434]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 10:00:25 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:00:25 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:00:25 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:00:25.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:00:25 compute-1 sudo[227402]: pam_unix(sudo:session): session closed for user root
Dec 06 10:00:25 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:00:25 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:00:25 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:00:25.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:00:25 compute-1 podman[227441]: 2025-12-06 10:00:25.760670438 +0000 UTC m=+0.113092224 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 06 10:00:25 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:25 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4d4001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:00:25 compute-1 sudo[227540]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yyfiupxjwkjmbvufifpntqyxgrhwqssv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015224.1947293-4154-108310403032688/AnsiballZ_systemd.py'
Dec 06 10:00:25 compute-1 sudo[227540]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 10:00:25 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:25 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4b8002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:00:26 compute-1 python3.9[227542]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 06 10:00:26 compute-1 systemd[1]: Reloading.
Dec 06 10:00:26 compute-1 systemd-rc-local-generator[227572]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 06 10:00:26 compute-1 systemd-sysv-generator[227575]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 06 10:00:26 compute-1 systemd[1]: Starting nova_compute container...
Dec 06 10:00:26 compute-1 systemd[1]: Started libcrun container.
Dec 06 10:00:26 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fe42b0bad23ec200123940026b03e1add6e1efb6ab3f7ea1cab6155db02ec0c3/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Dec 06 10:00:26 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fe42b0bad23ec200123940026b03e1add6e1efb6ab3f7ea1cab6155db02ec0c3/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec 06 10:00:26 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fe42b0bad23ec200123940026b03e1add6e1efb6ab3f7ea1cab6155db02ec0c3/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 06 10:00:26 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fe42b0bad23ec200123940026b03e1add6e1efb6ab3f7ea1cab6155db02ec0c3/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec 06 10:00:26 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fe42b0bad23ec200123940026b03e1add6e1efb6ab3f7ea1cab6155db02ec0c3/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 06 10:00:26 compute-1 podman[227582]: 2025-12-06 10:00:26.747229921 +0000 UTC m=+0.118631609 container init fb9b5b61d2ca2446a0d64f5cf1f50538f7d4513378afdaeab81ced8fe95510ba (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5, name=nova_compute, org.label-schema.license=GPLv2, config_id=edpm, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=nova_compute)
Dec 06 10:00:26 compute-1 podman[227582]: 2025-12-06 10:00:26.753669477 +0000 UTC m=+0.125071145 container start fb9b5b61d2ca2446a0d64f5cf1f50538f7d4513378afdaeab81ced8fe95510ba (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5, name=nova_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=nova_compute)
Dec 06 10:00:26 compute-1 podman[227582]: nova_compute
Dec 06 10:00:26 compute-1 nova_compute[227597]: + sudo -E kolla_set_configs
Dec 06 10:00:26 compute-1 systemd[1]: Started nova_compute container.
Dec 06 10:00:26 compute-1 sudo[227540]: pam_unix(sudo:session): session closed for user root
Dec 06 10:00:26 compute-1 nova_compute[227597]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 06 10:00:26 compute-1 nova_compute[227597]: INFO:__main__:Validating config file
Dec 06 10:00:26 compute-1 nova_compute[227597]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 06 10:00:26 compute-1 nova_compute[227597]: INFO:__main__:Copying service configuration files
Dec 06 10:00:26 compute-1 nova_compute[227597]: INFO:__main__:Deleting /etc/nova/nova.conf
Dec 06 10:00:26 compute-1 nova_compute[227597]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Dec 06 10:00:26 compute-1 nova_compute[227597]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Dec 06 10:00:26 compute-1 nova_compute[227597]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Dec 06 10:00:26 compute-1 nova_compute[227597]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Dec 06 10:00:26 compute-1 nova_compute[227597]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Dec 06 10:00:26 compute-1 nova_compute[227597]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Dec 06 10:00:26 compute-1 nova_compute[227597]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Dec 06 10:00:26 compute-1 nova_compute[227597]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Dec 06 10:00:26 compute-1 nova_compute[227597]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Dec 06 10:00:26 compute-1 nova_compute[227597]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Dec 06 10:00:26 compute-1 nova_compute[227597]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 06 10:00:26 compute-1 nova_compute[227597]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 06 10:00:26 compute-1 nova_compute[227597]: INFO:__main__:Deleting /etc/ceph
Dec 06 10:00:26 compute-1 nova_compute[227597]: INFO:__main__:Creating directory /etc/ceph
Dec 06 10:00:26 compute-1 nova_compute[227597]: INFO:__main__:Setting permission for /etc/ceph
Dec 06 10:00:26 compute-1 nova_compute[227597]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Dec 06 10:00:26 compute-1 nova_compute[227597]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Dec 06 10:00:26 compute-1 nova_compute[227597]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Dec 06 10:00:26 compute-1 nova_compute[227597]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Dec 06 10:00:26 compute-1 nova_compute[227597]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Dec 06 10:00:26 compute-1 nova_compute[227597]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec 06 10:00:26 compute-1 nova_compute[227597]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Dec 06 10:00:26 compute-1 nova_compute[227597]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec 06 10:00:26 compute-1 nova_compute[227597]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Dec 06 10:00:26 compute-1 nova_compute[227597]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Dec 06 10:00:26 compute-1 nova_compute[227597]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Dec 06 10:00:26 compute-1 nova_compute[227597]: INFO:__main__:Writing out command to execute
Dec 06 10:00:26 compute-1 nova_compute[227597]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Dec 06 10:00:26 compute-1 nova_compute[227597]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Dec 06 10:00:26 compute-1 nova_compute[227597]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Dec 06 10:00:26 compute-1 nova_compute[227597]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec 06 10:00:26 compute-1 nova_compute[227597]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
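Everything between "Loading config file" and the last "Setting permission" line above is kolla_set_configs working through /var/lib/kolla/config_files/config.json: with KOLLA_CONFIG_STRATEGY=COPY_ALWAYS it recopies every source into place on each start, then applies the separate permissions section, which is why the ceph keyring, ceph.conf and the .ssh files get their permissions set a second time at the end. A sketch of the file's shape, shown as a Python literal (on disk it is JSON; source/dest paths are taken from the log, while owner and perm values are illustrative assumptions):

    config_json = {
        "command": "nova-compute",
        "config_files": [
            # copied first, in order (COPY_ALWAYS recopies on every start)
            {"source": "/var/lib/kolla/config_files/01-nova.conf",
             "dest": "/etc/nova/nova.conf.d/01-nova.conf",
             "owner": "nova", "perm": "0600"},          # owner/perm assumed
            {"source": "/var/lib/kolla/config_files/ceph/ceph.conf",
             "dest": "/etc/ceph/ceph.conf",
             "owner": "nova", "perm": "0600"},          # owner/perm assumed
        ],
        "permissions": [
            # applied last -- the second round of "Setting permission" lines
            {"path": "/var/lib/nova/.ssh", "owner": "nova:nova", "recurse": True},
        ],
    }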
Dec 06 10:00:26 compute-1 nova_compute[227597]: ++ cat /run_command
Dec 06 10:00:26 compute-1 nova_compute[227597]: + CMD=nova-compute
Dec 06 10:00:26 compute-1 nova_compute[227597]: + ARGS=
Dec 06 10:00:26 compute-1 nova_compute[227597]: + sudo kolla_copy_cacerts
Dec 06 10:00:26 compute-1 nova_compute[227597]: + [[ ! -n '' ]]
Dec 06 10:00:26 compute-1 nova_compute[227597]: + . kolla_extend_start
Dec 06 10:00:26 compute-1 nova_compute[227597]: + echo 'Running command: '\''nova-compute'\'''
Dec 06 10:00:26 compute-1 nova_compute[227597]: Running command: 'nova-compute'
Dec 06 10:00:26 compute-1 nova_compute[227597]: + umask 0022
Dec 06 10:00:26 compute-1 nova_compute[227597]: + exec nova-compute
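The "+" lines are kolla_start running under bash -x: it reads the service command that kolla_set_configs wrote into /run_command, runs kolla_copy_cacerts to install the CA bundle, sources kolla_extend_start for image-specific setup, then execs nova-compute so the service replaces the shell inside the container. The same flow in outline, as a hedged Python rendering (the real kolla_start is a shell script):

    import os
    import shlex

    # Outline of the traced steps: read the command kolla_set_configs wrote,
    # echo it, set the umask, then exec so the service takes over the process.
    with open("/run_command") as f:
        cmd = shlex.split(f.read().strip())          # here: ['nova-compute']
    print(f"Running command: '{' '.join(cmd)}'")
    os.umask(0o022)
    os.execvp(cmd[0], cmd)                           # no fork: exec replaces the shell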
Dec 06 10:00:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:27 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4c40030f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:00:27 compute-1 ceph-osd[77465]: bluestore.MempoolThread fragmentation_score=0.000028 took=0.000254s
Dec 06 10:00:27 compute-1 ceph-mon[79770]: pgmap v568: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:00:27 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:00:27 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:00:27 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:00:27.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:00:27 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:00:27 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:00:27 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:00:27.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
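These radosgw triples (starting new request / req done / beast access line) are anonymous HEAD / requests alternating between 192.168.122.100 and 192.168.122.102 every couple of seconds, each answered 200 in about a millisecond -- the pattern of load-balancer or monitoring health checks rather than S3/Swift client traffic. One such probe looks like this (192.168.122.101:8080 is an assumed RGW frontend address; the log only records the probing client IPs):

    import http.client

    # One health-check-style probe against the beast frontend.
    conn = http.client.HTTPConnection("192.168.122.101", 8080, timeout=2)
    conn.request("HEAD", "/")
    print(conn.getresponse().status)   # 200, logged by beast as an anonymous request
    conn.close()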
Dec 06 10:00:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:27 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4c40030f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:00:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:27 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4d4001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
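The recurring ganesha.nfsd TIRPC EVENT lines come from ntirpc's svc_vc_recv seeing a TCP connection on which no complete record (here, no valid proxy-protocol header) ever arrives, so the transport is marked dead. Given the health probes visible above, the likeliest trigger is something opening and closing the NFS port without speaking NFS; a bare probe like the following would produce the same symptom (the address and port 2049 are assumptions):

    import socket

    # TCP open/close with no RPC payload against an assumed ganesha endpoint;
    # ntirpc logs "svc_vc_recv ... (will set dead)" for connections like this.
    s = socket.create_connection(("192.168.122.101", 2049), timeout=2)
    s.close()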
Dec 06 10:00:28 compute-1 python3.9[227759]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 10:00:28 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:00:29 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:29 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4dc001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:00:29 compute-1 python3.9[227910]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 06 10:00:29 compute-1 nova_compute[227597]: 2025-12-06 10:00:29.318 227601 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Dec 06 10:00:29 compute-1 nova_compute[227597]: 2025-12-06 10:00:29.319 227601 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Dec 06 10:00:29 compute-1 nova_compute[227597]: 2025-12-06 10:00:29.319 227601 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Dec 06 10:00:29 compute-1 nova_compute[227597]: 2025-12-06 10:00:29.320 227601 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
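os_vif builds this plugin table from setuptools entry points; the three classes logged above ship in the vif_plug_linux_bridge, vif_plug_noop and vif_plug_ovs packages bundled with os-vif. The same discovery can be reproduced with stevedore, which os_vif uses internally (assuming "os_vif" is the entry-point namespace the vif_plug_* packages register under):

    from stevedore import extension

    # Enumerate the plugins os_vif just logged.
    mgr = extension.ExtensionManager(namespace="os_vif", invoke_on_load=False)
    for ext in mgr:
        print(ext.name, ext.plugin)    # linux_bridge, noop, ovs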
Dec 06 10:00:29 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:00:29 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:00:29 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:00:29.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:00:29 compute-1 nova_compute[227597]: 2025-12-06 10:00:29.536 227601 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:00:29 compute-1 nova_compute[227597]: 2025-12-06 10:00:29.554 227601 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.018s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:00:29 compute-1 nova_compute[227597]: 2025-12-06 10:00:29.555 227601 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
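The failed grep is a capability probe, not an error: nova (via os-brick) greps the iscsiadm binary for the node.session.scan option to decide whether manual device scanning is available. In this container kolla_set_configs just replaced /usr/sbin/iscsiadm with the run-on-host wrapper (and /sbin is typically a symlink to /usr/sbin), so the string is absent, grep exits 1, and manual scan is treated as unsupported -- hence "failed. Not Retrying.". The probe itself amounts to:

    from oslo_concurrency import processutils

    # The probe as run above: exit 0 means the binary mentions
    # node.session.scan (manual scan usable); exit 1 means it does not.
    try:
        processutils.execute("grep", "-F", "node.session.scan", "/sbin/iscsiadm")
        manual_scan = True
    except processutils.ProcessExecutionError:
        manual_scan = False
    print(manual_scan)    # False here: iscsiadm is the run-on-host wrapper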
Dec 06 10:00:29 compute-1 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #37. Immutable memtables: 0.
Dec 06 10:00:29 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:00:29.643778) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 06 10:00:29 compute-1 ceph-mon[79770]: rocksdb: [db/flush_job.cc:856] [default] [JOB 19] Flushing memtable with next log file: 37
Dec 06 10:00:29 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015229644824, "job": 19, "event": "flush_started", "num_memtables": 1, "num_entries": 1264, "num_deletes": 251, "total_data_size": 3254279, "memory_usage": 3290384, "flush_reason": "Manual Compaction"}
Dec 06 10:00:29 compute-1 ceph-mon[79770]: rocksdb: [db/flush_job.cc:885] [default] [JOB 19] Level-0 flush table #38: started
Dec 06 10:00:29 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015229665057, "cf_name": "default", "job": 19, "event": "table_file_creation", "file_number": 38, "file_size": 2113460, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 19476, "largest_seqno": 20735, "table_properties": {"data_size": 2107863, "index_size": 2989, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1541, "raw_key_size": 11905, "raw_average_key_size": 19, "raw_value_size": 2096729, "raw_average_value_size": 3506, "num_data_blocks": 132, "num_entries": 598, "num_filter_entries": 598, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015121, "oldest_key_time": 1765015121, "file_creation_time": 1765015229, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 38, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:00:29 compute-1 ceph-mon[79770]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 19] Flush lasted 21313 microseconds, and 7957 cpu microseconds.
Dec 06 10:00:29 compute-1 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:00:29 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:00:29.665113) [db/flush_job.cc:967] [default] [JOB 19] Level-0 flush table #38: 2113460 bytes OK
Dec 06 10:00:29 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:00:29.665139) [db/memtable_list.cc:519] [default] Level-0 commit table #38 started
Dec 06 10:00:29 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:00:29.670916) [db/memtable_list.cc:722] [default] Level-0 commit table #38: memtable #1 done
Dec 06 10:00:29 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:00:29.670930) EVENT_LOG_v1 {"time_micros": 1765015229670925, "job": 19, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 06 10:00:29 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:00:29.670951) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 06 10:00:29 compute-1 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 19] Try to delete WAL files size 3248216, prev total WAL file size 3248216, number of live WAL files 2.
Dec 06 10:00:29 compute-1 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000034.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:00:29 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:00:29.671969) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031323535' seq:72057594037927935, type:22 .. '7061786F730031353037' seq:0, type:0; will stop at (end)
Dec 06 10:00:29 compute-1 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 20] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 06 10:00:29 compute-1 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 19 Base level 0, inputs: [38(2063KB)], [36(13MB)]
Dec 06 10:00:29 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015229672499, "job": 20, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [38], "files_L6": [36], "score": -1, "input_data_size": 16100917, "oldest_snapshot_seqno": -1}
Dec 06 10:00:29 compute-1 ceph-mon[79770]: pgmap v569: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Dec 06 10:00:29 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:00:29 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:00:29 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:00:29.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:00:29 compute-1 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 20] Generated table #39: 5036 keys, 13954767 bytes, temperature: kUnknown
Dec 06 10:00:29 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015229746546, "cf_name": "default", "job": 20, "event": "table_file_creation", "file_number": 39, "file_size": 13954767, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13919523, "index_size": 21566, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12613, "raw_key_size": 128338, "raw_average_key_size": 25, "raw_value_size": 13826447, "raw_average_value_size": 2745, "num_data_blocks": 885, "num_entries": 5036, "num_filter_entries": 5036, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014012, "oldest_key_time": 0, "file_creation_time": 1765015229, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 39, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:00:29 compute-1 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:00:29 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:00:29.746874) [db/compaction/compaction_job.cc:1663] [default] [JOB 20] Compacted 1@0 + 1@6 files to L6 => 13954767 bytes
Dec 06 10:00:29 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:00:29.748525) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 217.1 rd, 188.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.0, 13.3 +0.0 blob) out(13.3 +0.0 blob), read-write-amplify(14.2) write-amplify(6.6) OK, records in: 5556, records dropped: 520 output_compression: NoCompression
Dec 06 10:00:29 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:00:29.748542) EVENT_LOG_v1 {"time_micros": 1765015229748534, "job": 20, "event": "compaction_finished", "compaction_time_micros": 74147, "compaction_time_cpu_micros": 34413, "output_level": 6, "num_output_files": 1, "total_output_size": 13954767, "num_input_records": 5556, "num_output_records": 5036, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 06 10:00:29 compute-1 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000038.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:00:29 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015229748992, "job": 20, "event": "table_file_deletion", "file_number": 38}
Dec 06 10:00:29 compute-1 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000036.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:00:29 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015229751492, "job": 20, "event": "table_file_deletion", "file_number": 36}
Dec 06 10:00:29 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:00:29.671821) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:00:29 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:00:29.751565) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:00:29 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:00:29.751571) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:00:29 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:00:29.751573) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:00:29 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:00:29.751575) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:00:29 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:00:29.751577) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
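The rocksdb block above is one complete manual-compaction cycle in the ceph-mon store: the memtable is flushed to L0 table #38 (~2.1 MB, JOB 19), table #38 plus L6 table #36 are compacted into the new L6 table #39 (~13.9 MB, 520 records dropped, JOB 20), and the superseded WAL and input tables are deleted. The EVENT_LOG_v1 payloads are plain JSON after the prefix, so this activity can be extracted from a journal programmatically, e.g.:

    import json
    import re

    EVENT = re.compile(r"EVENT_LOG_v1 (\{.*\})$")

    # Yield the structured rocksdb events (flush_started, table_file_creation,
    # compaction_finished, ...) from journal lines like the ones above.
    def rocksdb_events(lines):
        for line in lines:
            m = EVENT.search(line)
            if m:
                yield json.loads(m.group(1))

    # e.g. {"event": "compaction_finished", "compaction_time_micros": 74147, ...}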
Dec 06 10:00:29 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:29 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4dc001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:00:29 compute-1 python3.9[228064]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
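Interleaved with the service start, edpm_ansible keeps checking for the nvme-cleaner systemd artefacts (the healthcheck unit, the service unit, and its .requires directory) with the stat module before deciding whether anything needs deploying. Each logged invocation is roughly equivalent to the following (follow=False and the sha1 checksum match the module arguments shown above):

    import hashlib
    import os

    # Rough equivalent of one logged ansible stat invocation.
    path = "/etc/systemd/system/edpm_nova_nvme_cleaner.service"
    try:
        st = os.stat(path, follow_symlinks=False)
        with open(path, "rb") as f:
            checksum = hashlib.sha1(f.read()).hexdigest()
        print(oct(st.st_mode), checksum)
    except FileNotFoundError:
        print("absent")    # the module would report exists: false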
Dec 06 10:00:29 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:29 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4c40041f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:00:30 compute-1 sudo[228065]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:00:30 compute-1 sudo[228065]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:00:30 compute-1 sudo[228065]: pam_unix(sudo:session): session closed for user root
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.197 227601 INFO nova.virt.driver [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.342 227601 INFO nova.compute.provider_config [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.385 227601 DEBUG oslo_concurrency.lockutils [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.385 227601 DEBUG oslo_concurrency.lockutils [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.386 227601 DEBUG oslo_concurrency.lockutils [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.386 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.386 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.387 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.387 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.387 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.387 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.387 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.387 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.387 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.388 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.388 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.388 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.388 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.388 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.388 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.389 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.389 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.389 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.389 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.389 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] console_host                   = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.389 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.389 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.390 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.390 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.390 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.390 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.390 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.390 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.391 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.391 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.391 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.391 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.391 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.391 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.391 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.392 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.392 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.392 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.392 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.392 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.392 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.393 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.393 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.393 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.393 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.393 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.393 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.394 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.394 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.394 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.394 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.394 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.394 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.394 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.395 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.395 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.395 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.395 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.395 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.395 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.395 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.396 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.396 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.396 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.396 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.396 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.396 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.396 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.397 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.397 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.397 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.397 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.397 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.397 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.397 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.398 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.398 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.398 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.398 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.398 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.398 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] my_block_storage_ip            = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.398 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] my_ip                          = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.399 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.399 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.399 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.399 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.399 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.399 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.399 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.400 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.400 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.400 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.400 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.400 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.400 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.400 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.401 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.401 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.401 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.401 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.401 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.401 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.401 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.402 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.402 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.402 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.402 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.402 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.402 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.402 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.403 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.403 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.403 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.403 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.403 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.403 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.403 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.404 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.404 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.404 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.404 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.404 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.404 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.404 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.405 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.405 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.405 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.405 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.405 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.405 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.405 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.406 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.406 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.406 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.406 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.406 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.406 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.406 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.407 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.407 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.407 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.407 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.407 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.407 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.407 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.408 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.408 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.408 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.408 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.408 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.408 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.408 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.409 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.409 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.409 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.409 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.409 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.409 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.410 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.410 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.410 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.410 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.410 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.410 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.411 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.411 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.411 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.411 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.411 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.411 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.411 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.412 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.412 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.412 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.412 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.412 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.412 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.413 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.413 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.413 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.413 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.413 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.413 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.413 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.414 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.414 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.414 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.414 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.414 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.414 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.414 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.415 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.415 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.415 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.415 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.415 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.415 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.415 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.416 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.416 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.416 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.416 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.416 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.416 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.417 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.417 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.417 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.417 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.417 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.417 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.417 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.418 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.418 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.418 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.418 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.418 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.418 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.419 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.419 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.419 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.419 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.419 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.419 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.420 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.420 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.420 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.420 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.420 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.420 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.420 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.421 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.421 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.421 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.421 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.421 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.421 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.422 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.422 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.468 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.469 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.469 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.470 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.470 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.470 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.470 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.470 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.470 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.471 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.471 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.471 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.471 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.471 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.471 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.471 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.472 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.472 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.472 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.472 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.472 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.472 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.473 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.473 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.473 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.473 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.473 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.473 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.473 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.474 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.474 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.474 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.474 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.474 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.474 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.474 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.475 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.475 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.475 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.475 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.475 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.475 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.476 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.476 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.476 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.476 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.476 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.476 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.476 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.477 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.477 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.477 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.477 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.477 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.477 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.477 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.478 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.478 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.478 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.478 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.478 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.478 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.478 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.479 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.479 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.479 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.479 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.479 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.479 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.479 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.480 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.480 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.480 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.480 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.480 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.480 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.481 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.481 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.481 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.481 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.481 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.481 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.481 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.482 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.482 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.482 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.482 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.482 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.482 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.483 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.483 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.483 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.483 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.483 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.483 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.484 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.484 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.484 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.484 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.484 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.484 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.485 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.485 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.485 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.485 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.485 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.485 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.486 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.486 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.486 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.486 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.486 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.486 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.486 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.487 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.487 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.487 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.487 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.488 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.488 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.488 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.488 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.488 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.488 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.489 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.489 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.489 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.489 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.489 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.489 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.489 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.490 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.490 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.490 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.490 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.490 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.490 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.490 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.491 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.491 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.491 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.491 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.491 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.491 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.492 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.492 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.492 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.492 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.492 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.492 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.493 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.493 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.493 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.493 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.493 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.493 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.493 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.494 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.494 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.494 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.494 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.494 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.494 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.494 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.495 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.495 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.495 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.495 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.495 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.495 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.495 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.496 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.496 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.496 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.496 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.496 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.496 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.497 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.497 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.497 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.497 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.497 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.497 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.497 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.498 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.498 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.498 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.498 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.498 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.499 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.499 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.499 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.499 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.499 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.499 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.499 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.500 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.500 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.500 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.500 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.500 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.500 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.501 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.501 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.501 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.501 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.501 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.502 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.502 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.502 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.502 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.502 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.502 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.503 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.503 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.503 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.503 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.503 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.503 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.504 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.504 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.504 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.504 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.504 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.504 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.504 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.505 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.505 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.505 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.505 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.505 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.505 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.505 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.506 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.506 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.506 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.506 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.506 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.506 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.506 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.507 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.507 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.507 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.507 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.507 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.507 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.508 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.508 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.508 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.508 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.508 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.509 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.509 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.509 227601 WARNING oslo_config.cfg [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Dec 06 10:00:30 compute-1 nova_compute[227597]: live_migration_uri is deprecated for removal in favor of two other options that
Dec 06 10:00:30 compute-1 nova_compute[227597]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Dec 06 10:00:30 compute-1 nova_compute[227597]: and ``live_migration_inbound_addr`` respectively.
Dec 06 10:00:30 compute-1 nova_compute[227597]: ).  Its value may be silently ignored in the future.
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.509 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
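(The deprecation warning above refers to the value logged on the previous line, live_migration_uri = qemu+tls://%s/system. A minimal [libvirt] sketch of the suggested replacement, assuming the TLS scheme implied by that URI; the inbound address is a hypothetical placeholder, not a value taken from this log:)

    [libvirt]
    # Replaces the scheme part of the deprecated URI (qemu+tls://%s/system).
    # "tls" matches live_migration_with_native_tls = True logged below.
    live_migration_scheme = tls
    # Replaces the %s target in the deprecated URI; set per compute host to
    # the address on which this host accepts incoming migrations.
    live_migration_inbound_addr = <address-of-this-host>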
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.509 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.510 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.510 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.510 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.510 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.510 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.511 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.511 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.511 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.511 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.511 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.511 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.511 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.512 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.512 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.512 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.512 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.512 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.rbd_secret_uuid        = 5ecd3f74-dade-5fc4-92ce-8950ae424258 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.513 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.513 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.513 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.513 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.513 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.513 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.513 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.514 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.514 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.514 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.514 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.514 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.514 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.515 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.515 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.515 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.515 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.515 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.515 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.515 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.516 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.516 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.516 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.516 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.516 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.516 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.516 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.517 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.517 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.517 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.517 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.517 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.517 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.517 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.518 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.518 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.518 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.518 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.518 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.518 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.519 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.519 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.519 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.519 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.519 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.519 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.519 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.520 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.520 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.520 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.520 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.520 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.520 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.520 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.521 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.521 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.521 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.521 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.521 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.522 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.522 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.522 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.522 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.522 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.522 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.523 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.523 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.523 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.523 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.523 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.523 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.524 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.524 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.524 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.524 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.524 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.524 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.525 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.525 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.525 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.525 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.525 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.525 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.526 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.526 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.526 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.526 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.526 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.526 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.526 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.527 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.527 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.527 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.527 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.527 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.527 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.528 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.528 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.528 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.528 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.528 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.528 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.528 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.529 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.529 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.529 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.529 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.529 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.529 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.530 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.530 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.530 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.530 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.530 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.530 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.530 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.531 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.531 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.531 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.531 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.531 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.532 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.532 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.532 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.532 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.532 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.532 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.533 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.533 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.533 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.533 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.533 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.533 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.534 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.534 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.534 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.534 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.534 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.535 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.535 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.535 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.535 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.535 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.535 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.536 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.536 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.536 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.536 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.536 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.537 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.537 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.537 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.537 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.537 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.537 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.538 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.538 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.538 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.538 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.538 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.538 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.539 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.539 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.539 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.539 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.539 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.539 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.540 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.540 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.540 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.540 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.540 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.540 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.540 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.541 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.541 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.541 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.541 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.541 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.541 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.542 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.542 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.542 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.542 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.542 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.542 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.542 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.543 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.543 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.543 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.543 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.543 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.543 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.543 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.544 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.544 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.544 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.544 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.544 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.544 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.545 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.545 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.545 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.545 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.545 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.546 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.546 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.546 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.546 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.546 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.546 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.546 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.547 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.547 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.547 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.547 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.547 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.547 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.547 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.548 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.548 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.548 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.548 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.548 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.548 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.548 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.549 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.549 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.549 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.549 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.549 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.550 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.550 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.550 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vnc.server_proxyclient_address = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.550 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.550 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.550 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.551 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.551 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.551 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.551 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.551 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.551 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.551 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.552 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.552 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.552 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.552 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.552 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.552 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.553 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.553 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.553 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.553 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.553 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.553 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.553 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.554 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.554 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.554 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.554 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.554 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.554 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.555 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.555 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.555 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.555 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.555 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.555 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.556 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.556 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.556 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.556 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.556 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.556 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.557 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.557 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.557 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.557 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.557 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.557 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.558 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.558 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.558 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.558 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.558 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.558 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.559 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.559 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.559 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.559 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.559 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.559 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.560 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.560 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.560 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.560 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.560 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.560 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.561 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.561 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.561 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.561 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.561 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.561 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.561 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.562 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.562 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.562 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.562 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.562 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.562 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.563 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.563 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.563 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.563 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.563 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.563 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.564 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.564 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.564 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.564 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.564 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.564 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.565 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.565 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.565 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.565 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.565 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.565 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.566 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.566 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.566 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.566 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.566 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.566 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.566 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.567 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.567 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.567 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.567 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.567 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.567 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.567 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.568 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.568 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.568 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.568 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.568 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.568 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.568 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.569 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.569 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.569 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.569 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.569 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.569 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.569 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.570 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.570 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.570 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.570 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.570 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.570 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.570 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.571 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.571 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.571 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.571 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.571 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.571 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.571 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.572 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.572 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.572 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.572 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.572 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.572 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.573 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.573 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.573 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.573 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.573 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.573 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.574 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.574 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.574 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.574 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.574 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.574 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.574 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.574 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.575 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.575 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.575 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.575 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.575 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.575 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.575 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.576 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.576 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.576 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.576 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.576 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.576 227601 DEBUG oslo_service.service [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
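[editor's note] The block ending at the asterisk line above is oslo.config's standard startup dump: every registered option is logged once as "group.option = value", with secrets masked as ****. A minimal Python sketch for pulling those pairs back out of journal lines like the ones above; this is illustrative tooling only, not anything nova ships:

    import re

    # oslo.config dump lines end with:
    #   <group>.<option>   = <value> log_opt_values <file>:<lineno>
    # The "] " anchor matches the end of the request-context bracket
    # that precedes the option name in each line.
    OPT_RE = re.compile(
        r"\] (?P<name>[A-Za-z0-9_.]+)\s+= (?P<value>.*?) log_opt_values "
    )

    def parse_opt_dump(lines):
        """Return {'group.option': 'value'} for every dump line found."""
        opts = {}
        for line in lines:
            m = OPT_RE.search(line)
            if m:
                opts[m.group("name")] = m.group("value")
        return opts

Fed the lines above, this yields e.g. opts["vnc.novncproxy_port"] == "6080"; values stay strings, since the dump carries no type information, and empty values (such as oslo_messaging_rabbit.ssl_ca_file) come back as "".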
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.578 227601 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.590 227601 DEBUG nova.virt.libvirt.host [None req-cea1e880-d63b-4a0b-8d20-816526471ae0 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.591 227601 DEBUG nova.virt.libvirt.host [None req-cea1e880-d63b-4a0b-8d20-816526471ae0 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.591 227601 DEBUG nova.virt.libvirt.host [None req-cea1e880-d63b-4a0b-8d20-816526471ae0 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.592 227601 DEBUG nova.virt.libvirt.host [None req-cea1e880-d63b-4a0b-8d20-816526471ae0 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Dec 06 10:00:30 compute-1 systemd[1]: Starting libvirt QEMU daemon...
Dec 06 10:00:30 compute-1 systemd[1]: Started libvirt QEMU daemon.
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.672 227601 DEBUG nova.virt.libvirt.host [None req-cea1e880-d63b-4a0b-8d20-816526471ae0 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f5c469a0910> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.676 227601 DEBUG nova.virt.libvirt.host [None req-cea1e880-d63b-4a0b-8d20-816526471ae0 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f5c469a0910> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.677 227601 INFO nova.virt.libvirt.driver [None req-cea1e880-d63b-4a0b-8d20-816526471ae0 - - - - - -] Connection event '1' reason 'None'
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.699 227601 WARNING nova.virt.libvirt.driver [None req-cea1e880-d63b-4a0b-8d20-816526471ae0 - - - - - -] Cannot update service status on host "compute-1.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-1.ctlplane.example.com could not be found.
Dec 06 10:00:30 compute-1 nova_compute[227597]: 2025-12-06 10:00:30.700 227601 DEBUG nova.virt.libvirt.volume.mount [None req-cea1e880-d63b-4a0b-8d20-816526471ae0 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
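[editor's note] The sequence above shows nova's libvirt driver opening qemu:///system and registering for lifecycle and connection events; the ComputeHostNotFound warning is expected on a first start, before the service record has been created. Stripped of nova's wrappers (retries, event dispatch threads), the connection step reduces to the libvirt Python bindings. A minimal sketch, assuming python3-libvirt is installed on the host; this is not code from nova:

    import libvirt

    # Open the same URI nova logs above. nova.virt.libvirt.host.Host
    # wraps this call with event registration and reconnect handling.
    conn = libvirt.open("qemu:///system")
    try:
        print(conn.getHostname())    # e.g. "compute-1"
        print(conn.getLibVersion())  # numeric libvirt version
    finally:
        conn.close()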
Dec 06 10:00:30 compute-1 sudo[228297]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zulavjvifdxpcsjuoxatfwipywgtonzp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015230.290518-4334-222018028197675/AnsiballZ_podman_container.py'
Dec 06 10:00:30 compute-1 sudo[228297]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 10:00:30 compute-1 podman[228274]: 2025-12-06 10:00:30.939224232 +0000 UTC m=+0.126843398 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent)
Dec 06 10:00:31 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:31 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4d4001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:00:31 compute-1 python3.9[228305]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Dec 06 10:00:31 compute-1 sudo[228297]: pam_unix(sudo:session): session closed for user root
Dec 06 10:00:31 compute-1 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 06 10:00:31 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:00:31 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:00:31 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:00:31.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:00:31 compute-1 nova_compute[227597]: 2025-12-06 10:00:31.650 227601 INFO nova.virt.libvirt.host [None req-cea1e880-d63b-4a0b-8d20-816526471ae0 - - - - - -] Libvirt host capabilities <capabilities>
Dec 06 10:00:31 compute-1 nova_compute[227597]: 
Dec 06 10:00:31 compute-1 nova_compute[227597]:   <host>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <uuid>9a5f3f62-e1ed-4c63-8d00-a3c5e56bbddc</uuid>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <cpu>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <arch>x86_64</arch>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model>EPYC-Rome-v4</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <vendor>AMD</vendor>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <microcode version='16777317'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <signature family='23' model='49' stepping='0'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <maxphysaddr mode='emulate' bits='40'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature name='x2apic'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature name='tsc-deadline'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature name='osxsave'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature name='hypervisor'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature name='tsc_adjust'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature name='spec-ctrl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature name='stibp'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature name='arch-capabilities'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature name='ssbd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature name='cmp_legacy'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature name='topoext'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature name='virt-ssbd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature name='lbrv'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature name='tsc-scale'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature name='vmcb-clean'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature name='pause-filter'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature name='pfthreshold'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature name='svme-addr-chk'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature name='rdctl-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature name='skip-l1dfl-vmentry'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature name='mds-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature name='pschange-mc-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <pages unit='KiB' size='4'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <pages unit='KiB' size='2048'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <pages unit='KiB' size='1048576'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </cpu>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <power_management>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <suspend_mem/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </power_management>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <iommu support='no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <migration_features>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <live/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <uri_transports>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <uri_transport>tcp</uri_transport>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <uri_transport>rdma</uri_transport>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </uri_transports>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </migration_features>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <topology>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <cells num='1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <cell id='0'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:           <memory unit='KiB'>7864312</memory>
Dec 06 10:00:31 compute-1 nova_compute[227597]:           <pages unit='KiB' size='4'>1966078</pages>
Dec 06 10:00:31 compute-1 nova_compute[227597]:           <pages unit='KiB' size='2048'>0</pages>
Dec 06 10:00:31 compute-1 nova_compute[227597]:           <pages unit='KiB' size='1048576'>0</pages>
Dec 06 10:00:31 compute-1 nova_compute[227597]:           <distances>
Dec 06 10:00:31 compute-1 nova_compute[227597]:             <sibling id='0' value='10'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:           </distances>
Dec 06 10:00:31 compute-1 nova_compute[227597]:           <cpus num='8'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:           </cpus>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         </cell>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </cells>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </topology>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <cache>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </cache>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <secmodel>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model>selinux</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <doi>0</doi>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </secmodel>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <secmodel>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model>dac</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <doi>0</doi>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <baselabel type='kvm'>+107:+107</baselabel>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <baselabel type='qemu'>+107:+107</baselabel>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </secmodel>
Dec 06 10:00:31 compute-1 nova_compute[227597]:   </host>
Dec 06 10:00:31 compute-1 nova_compute[227597]: 
Dec 06 10:00:31 compute-1 nova_compute[227597]:   <guest>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <os_type>hvm</os_type>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <arch name='i686'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <wordsize>32</wordsize>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <domain type='qemu'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <domain type='kvm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </arch>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <features>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <pae/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <nonpae/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <acpi default='on' toggle='yes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <apic default='on' toggle='no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <cpuselection/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <deviceboot/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <disksnapshot default='on' toggle='no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <externalSnapshot/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </features>
Dec 06 10:00:31 compute-1 nova_compute[227597]:   </guest>
Dec 06 10:00:31 compute-1 nova_compute[227597]: 
Dec 06 10:00:31 compute-1 nova_compute[227597]:   <guest>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <os_type>hvm</os_type>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <arch name='x86_64'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <wordsize>64</wordsize>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <domain type='qemu'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <domain type='kvm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </arch>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <features>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <acpi default='on' toggle='yes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <apic default='on' toggle='no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <cpuselection/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <deviceboot/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <disksnapshot default='on' toggle='no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <externalSnapshot/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </features>
Dec 06 10:00:31 compute-1 nova_compute[227597]:   </guest>
Dec 06 10:00:31 compute-1 nova_compute[227597]: 
Dec 06 10:00:31 compute-1 nova_compute[227597]: </capabilities>
Dec 06 10:00:31 compute-1 nova_compute[227597]: 
Dec 06 10:00:31 compute-1 nova_compute[227597]: 2025-12-06 10:00:31.657 227601 DEBUG nova.virt.libvirt.host [None req-cea1e880-d63b-4a0b-8d20-816526471ae0 - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Dec 06 10:00:31 compute-1 nova_compute[227597]: 2025-12-06 10:00:31.678 227601 DEBUG nova.virt.libvirt.host [None req-cea1e880-d63b-4a0b-8d20-816526471ae0 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Dec 06 10:00:31 compute-1 nova_compute[227597]: <domainCapabilities>
Dec 06 10:00:31 compute-1 nova_compute[227597]:   <path>/usr/libexec/qemu-kvm</path>
Dec 06 10:00:31 compute-1 nova_compute[227597]:   <domain>kvm</domain>
Dec 06 10:00:31 compute-1 nova_compute[227597]:   <machine>pc-i440fx-rhel7.6.0</machine>
Dec 06 10:00:31 compute-1 nova_compute[227597]:   <arch>i686</arch>
Dec 06 10:00:31 compute-1 nova_compute[227597]:   <vcpu max='240'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:   <iothreads supported='yes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:   <os supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <enum name='firmware'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <loader supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='type'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>rom</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>pflash</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='readonly'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>yes</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>no</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='secure'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>no</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </loader>
Dec 06 10:00:31 compute-1 nova_compute[227597]:   </os>
Dec 06 10:00:31 compute-1 nova_compute[227597]:   <cpu>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <mode name='host-passthrough' supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='hostPassthroughMigratable'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>on</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>off</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </mode>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <mode name='maximum' supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='maximumMigratable'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>on</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>off</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </mode>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <mode name='host-model' supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <vendor>AMD</vendor>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='x2apic'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='tsc-deadline'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='hypervisor'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='tsc_adjust'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='spec-ctrl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='stibp'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='ssbd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='cmp_legacy'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='overflow-recov'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='succor'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='ibrs'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='amd-ssbd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='virt-ssbd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='lbrv'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='tsc-scale'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='vmcb-clean'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='flushbyasid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='pause-filter'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='pfthreshold'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='svme-addr-chk'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='disable' name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </mode>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <mode name='custom' supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Broadwell'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Broadwell-IBRS'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Broadwell-noTSX'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Broadwell-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Broadwell-v2'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Broadwell-v3'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Broadwell-v4'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Cascadelake-Server'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Cascadelake-Server-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Cascadelake-Server-v2'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Cascadelake-Server-v3'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Cascadelake-Server-v4'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Cascadelake-Server-v5'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Cooperlake'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-bf16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='taa-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Cooperlake-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-bf16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='taa-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Cooperlake-v2'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-bf16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='taa-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Denverton'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='mpx'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Denverton-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='mpx'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Denverton-v2'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Denverton-v3'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Dhyana-v2'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='EPYC-Genoa'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amd-psfd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='auto-ibrs'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-bf16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bitalg'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512ifma'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='la57'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='no-nested-data-bp'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='null-sel-clr-base'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='stibp-always-on'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='EPYC-Genoa-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amd-psfd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='auto-ibrs'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-bf16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bitalg'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512ifma'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='la57'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='no-nested-data-bp'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='null-sel-clr-base'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='stibp-always-on'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='EPYC-Milan'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='EPYC-Milan-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='EPYC-Milan-v2'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amd-psfd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='no-nested-data-bp'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='null-sel-clr-base'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='stibp-always-on'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='EPYC-Rome'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='EPYC-Rome-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='EPYC-Rome-v2'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='EPYC-Rome-v3'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='EPYC-v3'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='EPYC-v4'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='GraniteRapids'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-bf16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-fp16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-int8'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-tile'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx-vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-bf16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-fp16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bitalg'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512ifma'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='bus-lock-detect'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fbsdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrc'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrs'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fzrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='la57'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='mcdt-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pbrsb-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='prefetchiti'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='psdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='sbdr-ssdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='serialize'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='taa-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='tsx-ldtrk'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xfd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='GraniteRapids-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-bf16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-fp16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-int8'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-tile'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx-vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-bf16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-fp16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bitalg'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512ifma'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='bus-lock-detect'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fbsdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrc'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrs'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fzrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='la57'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='mcdt-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pbrsb-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='prefetchiti'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='psdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='sbdr-ssdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='serialize'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='taa-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='tsx-ldtrk'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xfd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='GraniteRapids-v2'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-bf16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-fp16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-int8'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-tile'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx-vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx10'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx10-128'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx10-256'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx10-512'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-bf16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-fp16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bitalg'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512ifma'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='bus-lock-detect'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='cldemote'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fbsdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrc'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrs'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fzrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='la57'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='mcdt-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='movdir64b'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='movdiri'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pbrsb-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='prefetchiti'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='psdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='sbdr-ssdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='serialize'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ss'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='taa-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='tsx-ldtrk'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xfd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Haswell'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Haswell-IBRS'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Haswell-noTSX'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Haswell-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Haswell-v2'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Haswell-v3'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Haswell-v4'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Icelake-Server'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bitalg'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='la57'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Icelake-Server-noTSX'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bitalg'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='la57'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Icelake-Server-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bitalg'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='la57'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Icelake-Server-v2'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bitalg'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='la57'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Icelake-Server-v3'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bitalg'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='la57'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='taa-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Icelake-Server-v4'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bitalg'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512ifma'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='la57'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='taa-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Icelake-Server-v5'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bitalg'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512ifma'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='la57'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='taa-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Icelake-Server-v6'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bitalg'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512ifma'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='la57'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='taa-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Icelake-Server-v7'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bitalg'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512ifma'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='la57'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='taa-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='IvyBridge'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='IvyBridge-IBRS'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='IvyBridge-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='IvyBridge-v2'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='KnightsMill'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-4fmaps'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-4vnniw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512er'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512pf'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ss'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='KnightsMill-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-4fmaps'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-4vnniw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512er'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512pf'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ss'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Opteron_G4'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fma4'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xop'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Opteron_G4-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fma4'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xop'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Opteron_G5'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fma4'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='tbm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xop'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Opteron_G5-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fma4'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='tbm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xop'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='SapphireRapids'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-bf16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-int8'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-tile'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx-vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-bf16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-fp16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bitalg'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512ifma'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='bus-lock-detect'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrc'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrs'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fzrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='la57'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='serialize'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='taa-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='tsx-ldtrk'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xfd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='SapphireRapids-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-bf16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-int8'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-tile'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx-vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-bf16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-fp16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bitalg'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512ifma'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='bus-lock-detect'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrc'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrs'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fzrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='la57'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='serialize'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='taa-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='tsx-ldtrk'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xfd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='SapphireRapids-v2'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-bf16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-int8'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-tile'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx-vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-bf16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-fp16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bitalg'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512ifma'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='bus-lock-detect'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fbsdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrc'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrs'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fzrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='la57'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='psdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='sbdr-ssdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='serialize'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='taa-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='tsx-ldtrk'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xfd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='SapphireRapids-v3'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-bf16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-int8'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-tile'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx-vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-bf16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-fp16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bitalg'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512ifma'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='bus-lock-detect'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='cldemote'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fbsdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrc'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrs'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fzrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='la57'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='movdir64b'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='movdiri'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='psdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='sbdr-ssdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='serialize'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ss'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='taa-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='tsx-ldtrk'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xfd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='SierraForest'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx-ifma'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx-ne-convert'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx-vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx-vnni-int8'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='bus-lock-detect'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='cmpccxadd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fbsdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrs'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='mcdt-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pbrsb-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='psdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='sbdr-ssdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='serialize'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='SierraForest-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx-ifma'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx-ne-convert'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx-vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx-vnni-int8'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='bus-lock-detect'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='cmpccxadd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fbsdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrs'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='mcdt-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pbrsb-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='psdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='sbdr-ssdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='serialize'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Skylake-Client'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Skylake-Client-IBRS'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Skylake-Client-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Skylake-Client-v2'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Skylake-Client-v3'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Skylake-Client-v4'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Skylake-Server'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Skylake-Server-IBRS'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Skylake-Server-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Skylake-Server-v2'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Skylake-Server-v3'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Skylake-Server-v4'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Skylake-Server-v5'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 ceph-mon[79770]: pgmap v570: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Snowridge'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='cldemote'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='core-capability'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='movdir64b'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='movdiri'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='mpx'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='split-lock-detect'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Snowridge-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='cldemote'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='core-capability'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='movdir64b'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='movdiri'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='mpx'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='split-lock-detect'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Snowridge-v2'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='cldemote'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='core-capability'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='movdir64b'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='movdiri'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='split-lock-detect'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Snowridge-v3'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='cldemote'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='core-capability'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='movdir64b'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='movdiri'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='split-lock-detect'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Snowridge-v4'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='cldemote'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='movdir64b'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='movdiri'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='athlon'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='3dnow'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='3dnowext'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='athlon-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='3dnow'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='3dnowext'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='core2duo'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ss'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='core2duo-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ss'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='coreduo'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ss'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='coreduo-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ss'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='n270'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ss'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='n270-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ss'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='phenom'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='3dnow'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='3dnowext'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='phenom-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='3dnow'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='3dnowext'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </mode>
Dec 06 10:00:31 compute-1 nova_compute[227597]:   </cpu>
Dec 06 10:00:31 compute-1 nova_compute[227597]:   <memoryBacking supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <enum name='sourceType'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <value>file</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <value>anonymous</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <value>memfd</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:   </memoryBacking>
Dec 06 10:00:31 compute-1 nova_compute[227597]:   <devices>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <disk supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='diskDevice'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>disk</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>cdrom</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>floppy</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>lun</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='bus'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>ide</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>fdc</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>scsi</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>virtio</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>usb</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>sata</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='model'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>virtio</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>virtio-transitional</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>virtio-non-transitional</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </disk>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <graphics supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='type'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>vnc</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>egl-headless</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>dbus</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </graphics>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <video supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='modelType'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>vga</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>cirrus</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>virtio</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>none</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>bochs</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>ramfb</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </video>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <hostdev supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='mode'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>subsystem</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='startupPolicy'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>default</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>mandatory</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>requisite</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>optional</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='subsysType'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>usb</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>pci</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>scsi</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='capsType'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='pciBackend'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </hostdev>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <rng supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='model'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>virtio</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>virtio-transitional</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>virtio-non-transitional</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='backendModel'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>random</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>egd</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>builtin</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </rng>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <filesystem supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='driverType'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>path</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>handle</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>virtiofs</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </filesystem>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <tpm supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='model'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>tpm-tis</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>tpm-crb</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='backendModel'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>emulator</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>external</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='backendVersion'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>2.0</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </tpm>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <redirdev supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='bus'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>usb</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </redirdev>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <channel supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='type'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>pty</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>unix</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </channel>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <crypto supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='model'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='type'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>qemu</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='backendModel'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>builtin</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </crypto>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <interface supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='backendType'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>default</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>passt</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </interface>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <panic supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='model'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>isa</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>hyperv</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </panic>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <console supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='type'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>null</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>vc</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>pty</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>dev</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>file</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>pipe</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>stdio</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>udp</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>tcp</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>unix</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>qemu-vdagent</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>dbus</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </console>
Dec 06 10:00:31 compute-1 nova_compute[227597]:   </devices>
Dec 06 10:00:31 compute-1 nova_compute[227597]:   <features>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <gic supported='no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <vmcoreinfo supported='yes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <genid supported='yes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <backingStoreInput supported='yes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <backup supported='yes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <async-teardown supported='yes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <ps2 supported='yes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <sev supported='no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <sgx supported='no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <hyperv supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='features'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>relaxed</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>vapic</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>spinlocks</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>vpindex</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>runtime</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>synic</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>stimer</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>reset</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>vendor_id</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>frequencies</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>reenlightenment</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>tlbflush</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>ipi</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>avic</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>emsr_bitmap</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>xmm_input</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <defaults>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <spinlocks>4095</spinlocks>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <stimer_direct>on</stimer_direct>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <tlbflush_direct>on</tlbflush_direct>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <tlbflush_extended>on</tlbflush_extended>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </defaults>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </hyperv>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <launchSecurity supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='sectype'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>tdx</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </launchSecurity>
Dec 06 10:00:31 compute-1 nova_compute[227597]:   </features>
Dec 06 10:00:31 compute-1 nova_compute[227597]: </domainCapabilities>
Dec 06 10:00:31 compute-1 nova_compute[227597]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 06 10:00:31 compute-1 nova_compute[227597]: 2025-12-06 10:00:31.685 227601 DEBUG nova.virt.libvirt.host [None req-cea1e880-d63b-4a0b-8d20-816526471ae0 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Dec 06 10:00:31 compute-1 nova_compute[227597]: <domainCapabilities>
Dec 06 10:00:31 compute-1 nova_compute[227597]:   <path>/usr/libexec/qemu-kvm</path>
Dec 06 10:00:31 compute-1 nova_compute[227597]:   <domain>kvm</domain>
Dec 06 10:00:31 compute-1 nova_compute[227597]:   <machine>pc-q35-rhel9.8.0</machine>
Dec 06 10:00:31 compute-1 nova_compute[227597]:   <arch>i686</arch>
Dec 06 10:00:31 compute-1 nova_compute[227597]:   <vcpu max='4096'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:   <iothreads supported='yes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:   <os supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <enum name='firmware'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <loader supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='type'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>rom</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>pflash</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='readonly'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>yes</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>no</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='secure'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>no</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </loader>
Dec 06 10:00:31 compute-1 nova_compute[227597]:   </os>
Dec 06 10:00:31 compute-1 nova_compute[227597]:   <cpu>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <mode name='host-passthrough' supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='hostPassthroughMigratable'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>on</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>off</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </mode>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <mode name='maximum' supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='maximumMigratable'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>on</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>off</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </mode>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <mode name='host-model' supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <vendor>AMD</vendor>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='x2apic'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='tsc-deadline'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='hypervisor'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='tsc_adjust'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='spec-ctrl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='stibp'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='ssbd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='cmp_legacy'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='overflow-recov'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='succor'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='ibrs'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='amd-ssbd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='virt-ssbd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='lbrv'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='tsc-scale'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='vmcb-clean'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='flushbyasid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='pause-filter'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='pfthreshold'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='svme-addr-chk'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='disable' name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </mode>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <mode name='custom' supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Broadwell'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Broadwell-IBRS'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Broadwell-noTSX'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Broadwell-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Broadwell-v2'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Broadwell-v3'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Broadwell-v4'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Cascadelake-Server'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:00:31.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Cascadelake-Server-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Cascadelake-Server-v2'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Cascadelake-Server-v3'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Cascadelake-Server-v4'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Cascadelake-Server-v5'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Cooperlake'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-bf16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='taa-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Cooperlake-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-bf16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='taa-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Cooperlake-v2'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-bf16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='taa-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Denverton'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='mpx'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Denverton-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='mpx'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Denverton-v2'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Denverton-v3'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Dhyana-v2'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='EPYC-Genoa'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amd-psfd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='auto-ibrs'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-bf16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bitalg'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512ifma'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='la57'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='no-nested-data-bp'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='null-sel-clr-base'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='stibp-always-on'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='EPYC-Genoa-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amd-psfd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='auto-ibrs'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-bf16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bitalg'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512ifma'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='la57'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='no-nested-data-bp'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='null-sel-clr-base'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='stibp-always-on'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='EPYC-Milan'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='EPYC-Milan-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='EPYC-Milan-v2'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amd-psfd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='no-nested-data-bp'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='null-sel-clr-base'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='stibp-always-on'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='EPYC-Rome'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='EPYC-Rome-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='EPYC-Rome-v2'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='EPYC-Rome-v3'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='EPYC-v3'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='EPYC-v4'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='GraniteRapids'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-bf16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-fp16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-int8'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-tile'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx-vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-bf16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-fp16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bitalg'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512ifma'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='bus-lock-detect'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fbsdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrc'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrs'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fzrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='la57'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='mcdt-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pbrsb-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='prefetchiti'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='psdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='sbdr-ssdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='serialize'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='taa-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='tsx-ldtrk'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xfd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='GraniteRapids-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-bf16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-fp16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-int8'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-tile'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx-vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-bf16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-fp16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bitalg'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512ifma'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='bus-lock-detect'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fbsdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrc'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrs'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fzrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='la57'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='mcdt-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pbrsb-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='prefetchiti'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='psdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='sbdr-ssdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='serialize'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='taa-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='tsx-ldtrk'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xfd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='GraniteRapids-v2'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-bf16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-fp16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-int8'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-tile'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx-vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx10'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx10-128'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx10-256'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx10-512'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-bf16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-fp16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bitalg'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512ifma'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='bus-lock-detect'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='cldemote'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fbsdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrc'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrs'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fzrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='la57'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='mcdt-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='movdir64b'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='movdiri'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pbrsb-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='prefetchiti'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='psdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='sbdr-ssdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='serialize'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ss'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='taa-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='tsx-ldtrk'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xfd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Haswell'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Haswell-IBRS'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Haswell-noTSX'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Haswell-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Haswell-v2'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Haswell-v3'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Haswell-v4'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Icelake-Server'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bitalg'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='la57'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Icelake-Server-noTSX'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bitalg'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='la57'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Icelake-Server-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bitalg'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='la57'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Icelake-Server-v2'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bitalg'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='la57'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Icelake-Server-v3'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bitalg'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='la57'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='taa-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Icelake-Server-v4'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bitalg'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512ifma'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='la57'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='taa-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Icelake-Server-v5'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bitalg'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512ifma'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='la57'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='taa-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Icelake-Server-v6'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bitalg'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512ifma'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='la57'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='taa-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Icelake-Server-v7'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bitalg'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512ifma'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='la57'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='taa-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='IvyBridge'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='IvyBridge-IBRS'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='IvyBridge-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='IvyBridge-v2'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='KnightsMill'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-4fmaps'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-4vnniw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512er'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512pf'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ss'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='KnightsMill-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-4fmaps'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-4vnniw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512er'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512pf'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ss'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Opteron_G4'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fma4'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xop'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Opteron_G4-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fma4'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xop'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Opteron_G5'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fma4'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='tbm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xop'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Opteron_G5-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fma4'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='tbm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xop'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='SapphireRapids'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-bf16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-int8'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-tile'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx-vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-bf16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-fp16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bitalg'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512ifma'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='bus-lock-detect'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrc'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrs'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fzrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='la57'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='serialize'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='taa-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='tsx-ldtrk'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xfd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='SapphireRapids-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-bf16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-int8'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-tile'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx-vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-bf16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-fp16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bitalg'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512ifma'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='bus-lock-detect'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrc'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrs'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fzrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='la57'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='serialize'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='taa-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='tsx-ldtrk'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xfd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='SapphireRapids-v2'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-bf16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-int8'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-tile'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx-vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-bf16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-fp16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bitalg'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512ifma'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='bus-lock-detect'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fbsdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrc'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrs'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fzrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='la57'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='psdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='sbdr-ssdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='serialize'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='taa-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='tsx-ldtrk'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xfd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='SapphireRapids-v3'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-bf16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-int8'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-tile'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx-vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-bf16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-fp16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bitalg'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512ifma'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='bus-lock-detect'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='cldemote'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fbsdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrc'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrs'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fzrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='la57'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='movdir64b'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='movdiri'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='psdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='sbdr-ssdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='serialize'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ss'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='taa-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='tsx-ldtrk'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xfd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='SierraForest'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx-ifma'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx-ne-convert'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx-vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx-vnni-int8'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='bus-lock-detect'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='cmpccxadd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fbsdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrs'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='mcdt-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pbrsb-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='psdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='sbdr-ssdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='serialize'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='SierraForest-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx-ifma'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx-ne-convert'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx-vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx-vnni-int8'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='bus-lock-detect'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='cmpccxadd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fbsdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrs'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='mcdt-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pbrsb-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='psdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='sbdr-ssdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='serialize'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Skylake-Client'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Skylake-Client-IBRS'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Skylake-Client-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Skylake-Client-v2'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Skylake-Client-v3'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Skylake-Client-v4'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Skylake-Server'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Skylake-Server-IBRS'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Skylake-Server-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Skylake-Server-v2'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Skylake-Server-v3'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Skylake-Server-v4'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Skylake-Server-v5'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Snowridge'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='cldemote'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='core-capability'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='movdir64b'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='movdiri'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='mpx'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='split-lock-detect'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Snowridge-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='cldemote'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='core-capability'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='movdir64b'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='movdiri'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='mpx'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='split-lock-detect'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Snowridge-v2'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='cldemote'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='core-capability'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='movdir64b'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='movdiri'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='split-lock-detect'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Snowridge-v3'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='cldemote'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='core-capability'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='movdir64b'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='movdiri'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='split-lock-detect'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Snowridge-v4'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='cldemote'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='movdir64b'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='movdiri'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='athlon'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='3dnow'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='3dnowext'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='athlon-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='3dnow'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='3dnowext'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='core2duo'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ss'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='core2duo-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ss'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='coreduo'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ss'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='coreduo-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ss'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='n270'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ss'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='n270-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ss'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='phenom'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='3dnow'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='3dnowext'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='phenom-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='3dnow'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='3dnowext'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </mode>
Dec 06 10:00:31 compute-1 nova_compute[227597]:   </cpu>
Dec 06 10:00:31 compute-1 nova_compute[227597]:   <memoryBacking supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <enum name='sourceType'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <value>file</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <value>anonymous</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <value>memfd</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:   </memoryBacking>
Dec 06 10:00:31 compute-1 nova_compute[227597]:   <devices>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <disk supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='diskDevice'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>disk</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>cdrom</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>floppy</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>lun</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='bus'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>fdc</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>scsi</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>virtio</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>usb</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>sata</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='model'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>virtio</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>virtio-transitional</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>virtio-non-transitional</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </disk>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <graphics supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='type'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>vnc</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>egl-headless</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>dbus</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </graphics>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <video supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='modelType'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>vga</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>cirrus</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>virtio</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>none</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>bochs</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>ramfb</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </video>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <hostdev supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='mode'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>subsystem</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='startupPolicy'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>default</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>mandatory</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>requisite</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>optional</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='subsysType'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>usb</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>pci</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>scsi</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='capsType'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='pciBackend'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </hostdev>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <rng supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='model'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>virtio</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>virtio-transitional</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>virtio-non-transitional</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='backendModel'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>random</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>egd</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>builtin</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </rng>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <filesystem supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='driverType'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>path</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>handle</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>virtiofs</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </filesystem>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <tpm supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='model'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>tpm-tis</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>tpm-crb</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='backendModel'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>emulator</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>external</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='backendVersion'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>2.0</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </tpm>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <redirdev supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='bus'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>usb</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </redirdev>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <channel supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='type'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>pty</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>unix</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </channel>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <crypto supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='model'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='type'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>qemu</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='backendModel'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>builtin</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </crypto>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <interface supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='backendType'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>default</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>passt</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </interface>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <panic supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='model'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>isa</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>hyperv</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </panic>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <console supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='type'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>null</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>vc</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>pty</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>dev</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>file</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>pipe</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>stdio</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>udp</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>tcp</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>unix</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>qemu-vdagent</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>dbus</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </console>
Dec 06 10:00:31 compute-1 nova_compute[227597]:   </devices>
Dec 06 10:00:31 compute-1 nova_compute[227597]:   <features>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <gic supported='no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <vmcoreinfo supported='yes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <genid supported='yes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <backingStoreInput supported='yes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <backup supported='yes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <async-teardown supported='yes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <ps2 supported='yes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <sev supported='no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <sgx supported='no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <hyperv supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='features'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>relaxed</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>vapic</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>spinlocks</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>vpindex</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>runtime</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>synic</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>stimer</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>reset</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>vendor_id</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>frequencies</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>reenlightenment</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>tlbflush</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>ipi</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>avic</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>emsr_bitmap</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>xmm_input</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <defaults>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <spinlocks>4095</spinlocks>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <stimer_direct>on</stimer_direct>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <tlbflush_direct>on</tlbflush_direct>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <tlbflush_extended>on</tlbflush_extended>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </defaults>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </hyperv>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <launchSecurity supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='sectype'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>tdx</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </launchSecurity>
Dec 06 10:00:31 compute-1 nova_compute[227597]:   </features>
Dec 06 10:00:31 compute-1 nova_compute[227597]: </domainCapabilities>
Dec 06 10:00:31 compute-1 nova_compute[227597]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
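The dump ending above is the domainCapabilities document that nova's _get_domain_capabilities obtains from libvirt (the virConnectGetDomainCapabilities API). As a minimal sketch of how the same data can be fetched and read outside nova — assuming libvirt-python, a local qemu:///system connection, and the /usr/libexec/qemu-kvm binary named in the <path> element; this is an illustration, not nova's internal code — the custom-mode CPU models and their blockers can be listed like this:

    # Sketch: fetch a domainCapabilities document like the dump above and
    # report which custom-mode CPU models are usable on this host, and
    # which features block the rest. Assumes libvirt-python is installed
    # and qemu:///system is reachable (both assumptions, per the lead-in).
    import libvirt
    import xml.etree.ElementTree as ET

    conn = libvirt.open('qemu:///system')
    caps_xml = conn.getDomainCapabilities(
        '/usr/libexec/qemu-kvm',    # emulator <path> seen in the log
        'x86_64', 'q35', 'kvm')     # arch, machine type, virt type
    root = ET.fromstring(caps_xml)

    custom = root.find("./cpu/mode[@name='custom']")
    # <blockers model='X'> lists the features missing for model X
    blockers = {b.get('model'): [f.get('name') for f in b.findall('feature')]
                for b in custom.findall('blockers')}
    for model in custom.findall('model'):
        name = model.text
        if model.get('usable') == 'yes':
            print(f'{name}: usable')
        else:
            missing = ', '.join(blockers.get(name, [])) or '(unlisted)'
            print(f'{name}: blocked by {missing}')
    conn.close()

Run against a host like this one, the sketch would report, for example, that EPYC-Rome-v4 is usable while EPYC-Rome v1 through v3 are blocked only by the missing xsaves feature, matching the <blockers> entries in the dump.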
Dec 06 10:00:31 compute-1 nova_compute[227597]: 2025-12-06 10:00:31.713 227601 DEBUG nova.virt.libvirt.host [None req-cea1e880-d63b-4a0b-8d20-816526471ae0 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
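The debug line above shows nova probing domain capabilities once per supported machine type ({'pc', 'q35'}). A hypothetical stand-alone loop doing the same, under the same libvirt-python and qemu:///system assumptions as the previous sketch, would be:

    # Sketch of the per-machine-type probe implied by the log line above
    # (an illustration; nova's _get_machine_types derives the set itself).
    import libvirt
    import xml.etree.ElementTree as ET

    conn = libvirt.open('qemu:///system')
    for machine in ('pc', 'q35'):   # the set logged by _get_machine_types
        caps = conn.getDomainCapabilities('/usr/libexec/qemu-kvm',
                                          'x86_64', machine, 'kvm')
        # the <machine> element carries the canonical name for the alias
        print(machine, '->', ET.fromstring(caps).findtext('machine'))
    conn.close()

The <machine> element in each reply resolves the alias to its canonical name, which is why the 'pc' request in the dump that follows reports pc-i440fx-rhel7.6.0.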
Dec 06 10:00:31 compute-1 nova_compute[227597]: 2025-12-06 10:00:31.718 227601 DEBUG nova.virt.libvirt.host [None req-cea1e880-d63b-4a0b-8d20-816526471ae0 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Dec 06 10:00:31 compute-1 nova_compute[227597]: <domainCapabilities>
Dec 06 10:00:31 compute-1 nova_compute[227597]:   <path>/usr/libexec/qemu-kvm</path>
Dec 06 10:00:31 compute-1 nova_compute[227597]:   <domain>kvm</domain>
Dec 06 10:00:31 compute-1 nova_compute[227597]:   <machine>pc-i440fx-rhel7.6.0</machine>
Dec 06 10:00:31 compute-1 nova_compute[227597]:   <arch>x86_64</arch>
Dec 06 10:00:31 compute-1 nova_compute[227597]:   <vcpu max='240'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:   <iothreads supported='yes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:   <os supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <enum name='firmware'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <loader supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='type'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>rom</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>pflash</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='readonly'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>yes</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>no</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='secure'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>no</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </loader>
Dec 06 10:00:31 compute-1 nova_compute[227597]:   </os>
Dec 06 10:00:31 compute-1 nova_compute[227597]:   <cpu>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <mode name='host-passthrough' supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='hostPassthroughMigratable'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>on</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>off</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </mode>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <mode name='maximum' supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='maximumMigratable'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>on</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>off</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </mode>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <mode name='host-model' supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <vendor>AMD</vendor>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='x2apic'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='tsc-deadline'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='hypervisor'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='tsc_adjust'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='spec-ctrl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='stibp'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='ssbd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='cmp_legacy'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='overflow-recov'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='succor'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='ibrs'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='amd-ssbd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='virt-ssbd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='lbrv'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='tsc-scale'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='vmcb-clean'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='flushbyasid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='pause-filter'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='pfthreshold'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='svme-addr-chk'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='disable' name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </mode>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <mode name='custom' supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Broadwell'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Broadwell-IBRS'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Broadwell-noTSX'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Broadwell-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Broadwell-v2'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Broadwell-v3'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Broadwell-v4'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Cascadelake-Server'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Cascadelake-Server-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Cascadelake-Server-v2'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Cascadelake-Server-v3'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Cascadelake-Server-v4'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Cascadelake-Server-v5'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Cooperlake'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-bf16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='taa-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Cooperlake-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-bf16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='taa-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Cooperlake-v2'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-bf16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='taa-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Denverton'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='mpx'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Denverton-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='mpx'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Denverton-v2'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Denverton-v3'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Dhyana-v2'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='EPYC-Genoa'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amd-psfd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='auto-ibrs'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-bf16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bitalg'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512ifma'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='la57'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='no-nested-data-bp'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='null-sel-clr-base'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='stibp-always-on'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='EPYC-Genoa-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amd-psfd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='auto-ibrs'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-bf16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bitalg'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512ifma'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='la57'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='no-nested-data-bp'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='null-sel-clr-base'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='stibp-always-on'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='EPYC-Milan'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='EPYC-Milan-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='EPYC-Milan-v2'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amd-psfd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='no-nested-data-bp'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='null-sel-clr-base'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='stibp-always-on'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='EPYC-Rome'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='EPYC-Rome-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='EPYC-Rome-v2'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='EPYC-Rome-v3'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='EPYC-v3'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='EPYC-v4'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='GraniteRapids'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-bf16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-fp16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-int8'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-tile'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx-vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-bf16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-fp16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bitalg'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512ifma'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='bus-lock-detect'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fbsdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrc'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrs'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fzrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='la57'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='mcdt-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pbrsb-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='prefetchiti'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='psdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='sbdr-ssdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='serialize'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='taa-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='tsx-ldtrk'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xfd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='GraniteRapids-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-bf16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-fp16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-int8'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-tile'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx-vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-bf16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-fp16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bitalg'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512ifma'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='bus-lock-detect'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fbsdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrc'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrs'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fzrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='la57'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='mcdt-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pbrsb-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='prefetchiti'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='psdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='sbdr-ssdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='serialize'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='taa-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='tsx-ldtrk'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xfd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='GraniteRapids-v2'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-bf16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-fp16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-int8'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-tile'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx-vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx10'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx10-128'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx10-256'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx10-512'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-bf16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-fp16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bitalg'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512ifma'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='bus-lock-detect'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='cldemote'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fbsdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrc'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrs'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fzrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='la57'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='mcdt-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='movdir64b'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='movdiri'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pbrsb-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='prefetchiti'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='psdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='sbdr-ssdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='serialize'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ss'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='taa-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='tsx-ldtrk'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xfd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Haswell'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Haswell-IBRS'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Haswell-noTSX'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Haswell-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Haswell-v2'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Haswell-v3'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Haswell-v4'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Icelake-Server'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bitalg'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='la57'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Icelake-Server-noTSX'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bitalg'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='la57'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Icelake-Server-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bitalg'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='la57'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Icelake-Server-v2'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bitalg'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='la57'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Icelake-Server-v3'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bitalg'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='la57'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='taa-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Icelake-Server-v4'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bitalg'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512ifma'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='la57'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='taa-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Icelake-Server-v5'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bitalg'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512ifma'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='la57'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='taa-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Icelake-Server-v6'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bitalg'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512ifma'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='la57'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='taa-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Icelake-Server-v7'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bitalg'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512ifma'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='la57'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='taa-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='IvyBridge'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='IvyBridge-IBRS'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='IvyBridge-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='IvyBridge-v2'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='KnightsMill'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-4fmaps'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-4vnniw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512er'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512pf'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ss'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='KnightsMill-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-4fmaps'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-4vnniw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512er'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512pf'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ss'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Opteron_G4'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fma4'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xop'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Opteron_G4-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fma4'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xop'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Opteron_G5'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fma4'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='tbm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xop'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Opteron_G5-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fma4'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='tbm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xop'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='SapphireRapids'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-bf16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-int8'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-tile'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx-vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-bf16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-fp16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bitalg'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512ifma'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='bus-lock-detect'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrc'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrs'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fzrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='la57'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='serialize'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='taa-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='tsx-ldtrk'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xfd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='SapphireRapids-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-bf16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-int8'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-tile'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx-vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-bf16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-fp16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bitalg'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512ifma'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='bus-lock-detect'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrc'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrs'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fzrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='la57'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='serialize'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='taa-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='tsx-ldtrk'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xfd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='SapphireRapids-v2'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-bf16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-int8'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-tile'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx-vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-bf16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-fp16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bitalg'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512ifma'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='bus-lock-detect'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fbsdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrc'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrs'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fzrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='la57'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='psdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='sbdr-ssdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='serialize'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='taa-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='tsx-ldtrk'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xfd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='SapphireRapids-v3'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-bf16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-int8'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-tile'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx-vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-bf16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-fp16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bitalg'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512ifma'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='bus-lock-detect'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='cldemote'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fbsdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrc'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrs'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fzrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='la57'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='movdir64b'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='movdiri'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='psdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='sbdr-ssdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='serialize'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ss'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='taa-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='tsx-ldtrk'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xfd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='SierraForest'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx-ifma'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx-ne-convert'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx-vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx-vnni-int8'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='bus-lock-detect'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='cmpccxadd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fbsdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrs'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='mcdt-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pbrsb-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='psdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='sbdr-ssdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='serialize'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='SierraForest-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx-ifma'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx-ne-convert'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx-vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx-vnni-int8'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='bus-lock-detect'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='cmpccxadd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fbsdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrs'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='mcdt-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pbrsb-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='psdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='sbdr-ssdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='serialize'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Skylake-Client'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Skylake-Client-IBRS'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Skylake-Client-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Skylake-Client-v2'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Skylake-Client-v3'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Skylake-Client-v4'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Skylake-Server'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Skylake-Server-IBRS'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Skylake-Server-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Skylake-Server-v2'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Skylake-Server-v3'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Skylake-Server-v4'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Skylake-Server-v5'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Snowridge'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='cldemote'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='core-capability'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='movdir64b'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='movdiri'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='mpx'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='split-lock-detect'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Snowridge-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='cldemote'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='core-capability'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='movdir64b'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='movdiri'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='mpx'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='split-lock-detect'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Snowridge-v2'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='cldemote'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='core-capability'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='movdir64b'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='movdiri'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='split-lock-detect'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Snowridge-v3'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='cldemote'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='core-capability'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='movdir64b'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='movdiri'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='split-lock-detect'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Snowridge-v4'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='cldemote'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='movdir64b'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='movdiri'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='athlon'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='3dnow'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='3dnowext'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 sudo[228496]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kgtgevsdxgukpnmptsjmwfgbzpyfeleq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015231.499367-4359-276268117928285/AnsiballZ_systemd.py'
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='athlon-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='3dnow'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='3dnowext'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='core2duo'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ss'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='core2duo-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ss'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='coreduo'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ss'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='coreduo-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ss'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='n270'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ss'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='n270-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ss'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='phenom'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='3dnow'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='3dnowext'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='phenom-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='3dnow'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='3dnowext'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </mode>
Dec 06 10:00:31 compute-1 nova_compute[227597]:   </cpu>
Dec 06 10:00:31 compute-1 nova_compute[227597]:   <memoryBacking supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <enum name='sourceType'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <value>file</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <value>anonymous</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <value>memfd</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:   </memoryBacking>
Dec 06 10:00:31 compute-1 nova_compute[227597]:   <devices>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <disk supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='diskDevice'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>disk</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>cdrom</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>floppy</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>lun</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='bus'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>ide</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>fdc</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>scsi</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>virtio</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>usb</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>sata</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='model'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>virtio</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>virtio-transitional</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>virtio-non-transitional</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </disk>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <graphics supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='type'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>vnc</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>egl-headless</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>dbus</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </graphics>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <video supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='modelType'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>vga</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>cirrus</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>virtio</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>none</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>bochs</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>ramfb</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </video>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <hostdev supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='mode'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>subsystem</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='startupPolicy'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>default</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>mandatory</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>requisite</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>optional</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='subsysType'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>usb</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>pci</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>scsi</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='capsType'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='pciBackend'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </hostdev>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <rng supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='model'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>virtio</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>virtio-transitional</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>virtio-non-transitional</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='backendModel'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>random</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>egd</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>builtin</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </rng>
Dec 06 10:00:31 compute-1 sudo[228496]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <filesystem supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='driverType'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>path</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>handle</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>virtiofs</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </filesystem>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <tpm supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='model'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>tpm-tis</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>tpm-crb</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='backendModel'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>emulator</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>external</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='backendVersion'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>2.0</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </tpm>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <redirdev supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='bus'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>usb</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </redirdev>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <channel supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='type'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>pty</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>unix</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </channel>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <crypto supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='model'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='type'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>qemu</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='backendModel'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>builtin</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </crypto>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <interface supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='backendType'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>default</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>passt</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </interface>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <panic supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='model'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>isa</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>hyperv</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </panic>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <console supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='type'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>null</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>vc</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>pty</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>dev</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>file</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>pipe</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>stdio</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>udp</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>tcp</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>unix</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>qemu-vdagent</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>dbus</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </console>
Dec 06 10:00:31 compute-1 nova_compute[227597]:   </devices>
Dec 06 10:00:31 compute-1 nova_compute[227597]:   <features>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <gic supported='no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <vmcoreinfo supported='yes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <genid supported='yes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <backingStoreInput supported='yes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <backup supported='yes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <async-teardown supported='yes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <ps2 supported='yes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <sev supported='no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <sgx supported='no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <hyperv supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='features'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>relaxed</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>vapic</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>spinlocks</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>vpindex</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>runtime</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>synic</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>stimer</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>reset</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>vendor_id</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>frequencies</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>reenlightenment</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>tlbflush</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>ipi</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>avic</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>emsr_bitmap</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>xmm_input</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <defaults>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <spinlocks>4095</spinlocks>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <stimer_direct>on</stimer_direct>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <tlbflush_direct>on</tlbflush_direct>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <tlbflush_extended>on</tlbflush_extended>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </defaults>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </hyperv>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <launchSecurity supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='sectype'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>tdx</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </launchSecurity>
Dec 06 10:00:31 compute-1 nova_compute[227597]:   </features>
Dec 06 10:00:31 compute-1 nova_compute[227597]: </domainCapabilities>
Dec 06 10:00:31 compute-1 nova_compute[227597]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 06 10:00:31 compute-1 nova_compute[227597]: 2025-12-06 10:00:31.782 227601 DEBUG nova.virt.libvirt.host [None req-cea1e880-d63b-4a0b-8d20-816526471ae0 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Dec 06 10:00:31 compute-1 nova_compute[227597]: <domainCapabilities>
Dec 06 10:00:31 compute-1 nova_compute[227597]:   <path>/usr/libexec/qemu-kvm</path>
Dec 06 10:00:31 compute-1 nova_compute[227597]:   <domain>kvm</domain>
Dec 06 10:00:31 compute-1 nova_compute[227597]:   <machine>pc-q35-rhel9.8.0</machine>
Dec 06 10:00:31 compute-1 nova_compute[227597]:   <arch>x86_64</arch>
Dec 06 10:00:31 compute-1 nova_compute[227597]:   <vcpu max='4096'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:   <iothreads supported='yes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:   <os supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <enum name='firmware'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <value>efi</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <loader supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='type'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>rom</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>pflash</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='readonly'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>yes</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>no</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='secure'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>yes</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>no</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </loader>
Dec 06 10:00:31 compute-1 nova_compute[227597]:   </os>
Dec 06 10:00:31 compute-1 nova_compute[227597]:   <cpu>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <mode name='host-passthrough' supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='hostPassthroughMigratable'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>on</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>off</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </mode>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <mode name='maximum' supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='maximumMigratable'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>on</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>off</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </mode>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <mode name='host-model' supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <vendor>AMD</vendor>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='x2apic'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='tsc-deadline'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='hypervisor'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='tsc_adjust'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='spec-ctrl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='stibp'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='ssbd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='cmp_legacy'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='overflow-recov'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='succor'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='ibrs'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='amd-ssbd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='virt-ssbd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='lbrv'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='tsc-scale'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='vmcb-clean'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='flushbyasid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='pause-filter'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='pfthreshold'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='svme-addr-chk'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <feature policy='disable' name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </mode>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <mode name='custom' supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Broadwell'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Broadwell-IBRS'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Broadwell-noTSX'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Broadwell-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Broadwell-v2'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Broadwell-v3'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Broadwell-v4'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Cascadelake-Server'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Cascadelake-Server-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Cascadelake-Server-v2'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Cascadelake-Server-v3'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Cascadelake-Server-v4'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Cascadelake-Server-v5'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Cooperlake'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-bf16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='taa-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Cooperlake-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-bf16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='taa-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Cooperlake-v2'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-bf16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='taa-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Denverton'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='mpx'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Denverton-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='mpx'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Denverton-v2'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Denverton-v3'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Dhyana-v2'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='EPYC-Genoa'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amd-psfd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='auto-ibrs'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-bf16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bitalg'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512ifma'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='la57'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='no-nested-data-bp'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='null-sel-clr-base'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='stibp-always-on'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='EPYC-Genoa-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amd-psfd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='auto-ibrs'/>
Dec 06 10:00:31 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:31 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4b8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-bf16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bitalg'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512ifma'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='la57'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='no-nested-data-bp'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='null-sel-clr-base'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='stibp-always-on'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='EPYC-Milan'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='EPYC-Milan-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='EPYC-Milan-v2'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amd-psfd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='no-nested-data-bp'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='null-sel-clr-base'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='stibp-always-on'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='EPYC-Rome'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='EPYC-Rome-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='EPYC-Rome-v2'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='EPYC-Rome-v3'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='EPYC-v3'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='EPYC-v4'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='GraniteRapids'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-bf16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-fp16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-int8'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-tile'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx-vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-bf16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-fp16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bitalg'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512ifma'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='bus-lock-detect'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fbsdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrc'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrs'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fzrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='la57'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='mcdt-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pbrsb-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='prefetchiti'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='psdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='sbdr-ssdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='serialize'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='taa-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='tsx-ldtrk'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xfd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='GraniteRapids-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-bf16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-fp16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-int8'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-tile'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx-vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-bf16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-fp16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bitalg'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512ifma'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='bus-lock-detect'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fbsdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrc'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrs'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fzrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='la57'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='mcdt-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pbrsb-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='prefetchiti'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='psdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='sbdr-ssdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='serialize'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='taa-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='tsx-ldtrk'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xfd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='GraniteRapids-v2'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-bf16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-fp16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-int8'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-tile'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx-vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx10'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx10-128'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx10-256'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx10-512'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-bf16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-fp16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bitalg'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512ifma'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='bus-lock-detect'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='cldemote'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fbsdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrc'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrs'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fzrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='la57'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='mcdt-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='movdir64b'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='movdiri'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pbrsb-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='prefetchiti'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='psdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='sbdr-ssdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='serialize'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ss'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='taa-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='tsx-ldtrk'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xfd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Haswell'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Haswell-IBRS'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Haswell-noTSX'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Haswell-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Haswell-v2'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Haswell-v3'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Haswell-v4'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Icelake-Server'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bitalg'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='la57'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Icelake-Server-noTSX'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bitalg'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='la57'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Icelake-Server-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bitalg'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='la57'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Icelake-Server-v2'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bitalg'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='la57'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Icelake-Server-v3'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bitalg'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='la57'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='taa-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Icelake-Server-v4'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bitalg'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512ifma'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='la57'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='taa-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Icelake-Server-v5'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bitalg'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512ifma'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='la57'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='taa-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Icelake-Server-v6'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bitalg'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512ifma'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='la57'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='taa-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Icelake-Server-v7'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bitalg'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512ifma'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='la57'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='taa-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='IvyBridge'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='IvyBridge-IBRS'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='IvyBridge-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='IvyBridge-v2'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='KnightsMill'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-4fmaps'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-4vnniw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512er'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512pf'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ss'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='KnightsMill-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-4fmaps'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-4vnniw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512er'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512pf'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ss'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Opteron_G4'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fma4'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xop'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Opteron_G4-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fma4'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xop'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Opteron_G5'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fma4'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='tbm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xop'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Opteron_G5-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fma4'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='tbm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xop'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='SapphireRapids'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-bf16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-int8'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-tile'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx-vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-bf16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-fp16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bitalg'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512ifma'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='bus-lock-detect'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrc'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrs'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fzrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='la57'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='serialize'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='taa-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='tsx-ldtrk'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xfd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='SapphireRapids-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-bf16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-int8'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-tile'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx-vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-bf16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-fp16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bitalg'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512ifma'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='bus-lock-detect'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrc'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrs'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fzrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='la57'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='serialize'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='taa-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='tsx-ldtrk'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xfd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='SapphireRapids-v2'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-bf16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-int8'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-tile'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx-vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-bf16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-fp16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bitalg'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512ifma'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='bus-lock-detect'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fbsdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrc'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrs'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fzrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='la57'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='psdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='sbdr-ssdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='serialize'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='taa-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='tsx-ldtrk'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xfd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='SapphireRapids-v3'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-bf16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-int8'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='amx-tile'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx-vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-bf16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-fp16'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bitalg'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512ifma'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='bus-lock-detect'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='cldemote'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fbsdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrc'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrs'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fzrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='la57'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='movdir64b'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='movdiri'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='psdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='sbdr-ssdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='serialize'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ss'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='taa-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='tsx-ldtrk'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xfd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='SierraForest'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx-ifma'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx-ne-convert'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx-vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx-vnni-int8'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='bus-lock-detect'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='cmpccxadd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fbsdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrs'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='mcdt-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pbrsb-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='psdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='sbdr-ssdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='serialize'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='SierraForest-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx-ifma'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx-ne-convert'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx-vnni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx-vnni-int8'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='bus-lock-detect'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='cmpccxadd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fbsdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='fsrs'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ibrs-all'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='mcdt-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pbrsb-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='psdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='sbdr-ssdp-no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='serialize'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vaes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Skylake-Client'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Skylake-Client-IBRS'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Skylake-Client-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Skylake-Client-v2'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Skylake-Client-v3'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Skylake-Client-v4'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Skylake-Server'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Skylake-Server-IBRS'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Skylake-Server-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Skylake-Server-v2'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='hle'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='rtm'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Skylake-Server-v3'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Skylake-Server-v4'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Skylake-Server-v5'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512bw'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512cd'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512dq'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512f'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='avx512vl'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='invpcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pcid'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='pku'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Snowridge'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='cldemote'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='core-capability'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='movdir64b'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='movdiri'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='mpx'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='split-lock-detect'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Snowridge-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='cldemote'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='core-capability'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='movdir64b'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='movdiri'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='mpx'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='split-lock-detect'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Snowridge-v2'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='cldemote'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='core-capability'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='movdir64b'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='movdiri'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='split-lock-detect'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Snowridge-v3'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='cldemote'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='core-capability'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='movdir64b'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='movdiri'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='split-lock-detect'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='Snowridge-v4'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='cldemote'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='erms'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='gfni'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='movdir64b'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='movdiri'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='xsaves'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='athlon'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='3dnow'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='3dnowext'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='athlon-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='3dnow'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='3dnowext'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='core2duo'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ss'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='core2duo-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ss'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='coreduo'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ss'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='coreduo-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ss'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='n270'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ss'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='n270-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='ss'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='phenom'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='3dnow'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='3dnowext'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <blockers model='phenom-v1'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='3dnow'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <feature name='3dnowext'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </blockers>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </mode>
Dec 06 10:00:31 compute-1 nova_compute[227597]:   </cpu>
Dec 06 10:00:31 compute-1 nova_compute[227597]:   <memoryBacking supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <enum name='sourceType'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <value>file</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <value>anonymous</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <value>memfd</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:   </memoryBacking>
Dec 06 10:00:31 compute-1 nova_compute[227597]:   <devices>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <disk supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='diskDevice'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>disk</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>cdrom</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>floppy</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>lun</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='bus'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>fdc</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>scsi</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>virtio</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>usb</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>sata</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='model'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>virtio</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>virtio-transitional</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>virtio-non-transitional</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </disk>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <graphics supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='type'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>vnc</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>egl-headless</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>dbus</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </graphics>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <video supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='modelType'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>vga</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>cirrus</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>virtio</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>none</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>bochs</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>ramfb</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </video>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <hostdev supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='mode'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>subsystem</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='startupPolicy'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>default</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>mandatory</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>requisite</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>optional</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='subsysType'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>usb</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>pci</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>scsi</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='capsType'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='pciBackend'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </hostdev>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <rng supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='model'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>virtio</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>virtio-transitional</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>virtio-non-transitional</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='backendModel'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>random</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>egd</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>builtin</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </rng>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <filesystem supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='driverType'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>path</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>handle</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>virtiofs</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </filesystem>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <tpm supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='model'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>tpm-tis</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>tpm-crb</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='backendModel'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>emulator</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>external</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='backendVersion'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>2.0</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </tpm>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <redirdev supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='bus'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>usb</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </redirdev>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <channel supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='type'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>pty</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>unix</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </channel>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <crypto supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='model'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='type'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>qemu</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='backendModel'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>builtin</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </crypto>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <interface supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='backendType'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>default</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>passt</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </interface>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <panic supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='model'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>isa</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>hyperv</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </panic>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <console supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='type'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>null</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>vc</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>pty</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>dev</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>file</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>pipe</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>stdio</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>udp</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>tcp</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>unix</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>qemu-vdagent</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>dbus</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </console>
Dec 06 10:00:31 compute-1 nova_compute[227597]:   </devices>
Dec 06 10:00:31 compute-1 nova_compute[227597]:   <features>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <gic supported='no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <vmcoreinfo supported='yes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <genid supported='yes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <backingStoreInput supported='yes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <backup supported='yes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <async-teardown supported='yes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <ps2 supported='yes'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <sev supported='no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <sgx supported='no'/>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <hyperv supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='features'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>relaxed</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>vapic</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>spinlocks</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>vpindex</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>runtime</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>synic</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>stimer</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>reset</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>vendor_id</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>frequencies</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>reenlightenment</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>tlbflush</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>ipi</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>avic</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>emsr_bitmap</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>xmm_input</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <defaults>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <spinlocks>4095</spinlocks>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <stimer_direct>on</stimer_direct>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <tlbflush_direct>on</tlbflush_direct>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <tlbflush_extended>on</tlbflush_extended>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </defaults>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </hyperv>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     <launchSecurity supported='yes'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       <enum name='sectype'>
Dec 06 10:00:31 compute-1 nova_compute[227597]:         <value>tdx</value>
Dec 06 10:00:31 compute-1 nova_compute[227597]:       </enum>
Dec 06 10:00:31 compute-1 nova_compute[227597]:     </launchSecurity>
Dec 06 10:00:31 compute-1 nova_compute[227597]:   </features>
Dec 06 10:00:31 compute-1 nova_compute[227597]: </domainCapabilities>
Dec 06 10:00:31 compute-1 nova_compute[227597]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
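[editor's note] The XML dump above is libvirt's domainCapabilities document, which nova's host.py fetches (_get_domain_capabilities) to learn what the emulator supports: console backends, panic models, Hyper-V enlightenments, launchSecurity types, and so on. A minimal sketch of the same query via the libvirt-python bindings, assuming a local qemu:///system daemon like the virtqemud seen later in this log:

    import libvirt

    # Connect to the local system libvirt daemon (assumption: qemu:///system,
    # which is what nova-compute uses on a host like this one).
    conn = libvirt.open('qemu:///system')

    # Ask for the domain capabilities of the default emulator for x86_64/KVM;
    # passing None lets libvirt pick its defaults for emulator and machine.
    xml = conn.getDomainCapabilities(None, 'x86_64', None, 'kvm', 0)
    print(xml)  # the same <domainCapabilities> document dumped in the log
    conn.close()
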
Dec 06 10:00:31 compute-1 nova_compute[227597]: 2025-12-06 10:00:31.855 227601 DEBUG nova.virt.libvirt.host [None req-cea1e880-d63b-4a0b-8d20-816526471ae0 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Dec 06 10:00:31 compute-1 nova_compute[227597]: 2025-12-06 10:00:31.856 227601 DEBUG nova.virt.libvirt.host [None req-cea1e880-d63b-4a0b-8d20-816526471ae0 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Dec 06 10:00:31 compute-1 nova_compute[227597]: 2025-12-06 10:00:31.856 227601 INFO nova.virt.libvirt.host [None req-cea1e880-d63b-4a0b-8d20-816526471ae0 - - - - - -] Secure Boot support detected
Dec 06 10:00:31 compute-1 nova_compute[227597]: 2025-12-06 10:00:31.867 227601 INFO nova.virt.libvirt.driver [None req-cea1e880-d63b-4a0b-8d20-816526471ae0 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Dec 06 10:00:31 compute-1 nova_compute[227597]: 2025-12-06 10:00:31.885 227601 DEBUG nova.virt.libvirt.driver [None req-cea1e880-d63b-4a0b-8d20-816526471ae0 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Dec 06 10:00:31 compute-1 nova_compute[227597]: 2025-12-06 10:00:31.907 227601 INFO nova.virt.node [None req-cea1e880-d63b-4a0b-8d20-816526471ae0 - - - - - -] Determined node identity ff2f17cb-ff1d-4da7-9560-4be741380cb1 from /var/lib/nova/compute_id
Dec 06 10:00:31 compute-1 nova_compute[227597]: 2025-12-06 10:00:31.919 227601 WARNING nova.compute.manager [None req-cea1e880-d63b-4a0b-8d20-816526471ae0 - - - - - -] Compute nodes ['ff2f17cb-ff1d-4da7-9560-4be741380cb1'] for host compute-1.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
Dec 06 10:00:31 compute-1 nova_compute[227597]: 2025-12-06 10:00:31.940 227601 INFO nova.compute.manager [None req-cea1e880-d63b-4a0b-8d20-816526471ae0 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Dec 06 10:00:31 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:31 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4b8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:00:31 compute-1 nova_compute[227597]: 2025-12-06 10:00:31.965 227601 WARNING nova.compute.manager [None req-cea1e880-d63b-4a0b-8d20-816526471ae0 - - - - - -] No compute node record found for host compute-1.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-1.ctlplane.example.com could not be found.
Dec 06 10:00:31 compute-1 nova_compute[227597]: 2025-12-06 10:00:31.965 227601 DEBUG oslo_concurrency.lockutils [None req-cea1e880-d63b-4a0b-8d20-816526471ae0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:00:31 compute-1 nova_compute[227597]: 2025-12-06 10:00:31.966 227601 DEBUG oslo_concurrency.lockutils [None req-cea1e880-d63b-4a0b-8d20-816526471ae0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:00:31 compute-1 nova_compute[227597]: 2025-12-06 10:00:31.966 227601 DEBUG oslo_concurrency.lockutils [None req-cea1e880-d63b-4a0b-8d20-816526471ae0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:00:31 compute-1 nova_compute[227597]: 2025-12-06 10:00:31.966 227601 DEBUG nova.compute.resource_tracker [None req-cea1e880-d63b-4a0b-8d20-816526471ae0 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:00:31 compute-1 nova_compute[227597]: 2025-12-06 10:00:31.967 227601 DEBUG oslo_concurrency.processutils [None req-cea1e880-d63b-4a0b-8d20-816526471ae0 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
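[editor's note] As part of auditing available resources, the resource tracker shells out to `ceph df --format=json`, apparently to size the Ceph-backed storage. A minimal sketch of running and parsing that command with the same --id/--conf arguments as the log line; the `stats` key names are taken from ceph's JSON report but should be treated as an assumption of this sketch:

    import json
    import subprocess

    # Same command the resource tracker runs (see the log line above).
    out = subprocess.run(
        ['ceph', 'df', '--format=json',
         '--id', 'openstack', '--conf', '/etc/ceph/ceph.conf'],
        check=True, capture_output=True, text=True,
    ).stdout

    report = json.loads(out)
    # Cluster-wide byte counters; key names assumed from ceph's JSON schema.
    stats = report['stats']
    total_gib = stats['total_bytes'] / 1024 ** 3
    avail_gib = stats['total_avail_bytes'] / 1024 ** 3
    print(f'cluster: {avail_gib:.1f} GiB free of {total_gib:.1f} GiB')
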
Dec 06 10:00:32 compute-1 python3.9[228498]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 06 10:00:32 compute-1 systemd[1]: Stopping nova_compute container...
Dec 06 10:00:32 compute-1 nova_compute[227597]: 2025-12-06 10:00:32.312 227601 DEBUG oslo_concurrency.lockutils [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 06 10:00:32 compute-1 nova_compute[227597]: 2025-12-06 10:00:32.312 227601 DEBUG oslo_concurrency.lockutils [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 06 10:00:32 compute-1 nova_compute[227597]: 2025-12-06 10:00:32.313 227601 DEBUG oslo_concurrency.lockutils [None req-5191dc73-2969-4fd9-8fc3-d6dd743c4e12 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
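[editor's note] The Acquiring/Acquired/Releasing triplets above (both for "compute_resources" and "singleton_lock") come from oslo.concurrency's lock helpers; the lockutils.py line numbers in the messages point at the wrapper that emits them. A minimal sketch of the decorator pattern that produces such triplets, reusing the resource tracker's lock name:

    from oslo_concurrency import lockutils


    # Entering this function logs 'Acquiring lock "compute_resources" ...',
    # then 'Lock "compute_resources" acquired ...' and '... "released" ...'
    # on the way out, much like the DEBUG triplets in the log.
    @lockutils.synchronized('compute_resources')
    def clean_compute_node_cache():
        # critical section: at most one thread of this process runs here
        pass
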
Dec 06 10:00:32 compute-1 virtqemud[228188]: libvirt version: 11.9.0, package: 1.el9 (builder@centos.org, 2025-11-04-09:54:50, )
Dec 06 10:00:32 compute-1 virtqemud[228188]: hostname: compute-1
Dec 06 10:00:32 compute-1 virtqemud[228188]: End of file while reading data: Input/output error
Dec 06 10:00:32 compute-1 systemd[1]: libpod-fb9b5b61d2ca2446a0d64f5cf1f50538f7d4513378afdaeab81ced8fe95510ba.scope: Deactivated successfully.
Dec 06 10:00:32 compute-1 systemd[1]: libpod-fb9b5b61d2ca2446a0d64f5cf1f50538f7d4513378afdaeab81ced8fe95510ba.scope: Consumed 4.076s CPU time.
Dec 06 10:00:32 compute-1 podman[228522]: 2025-12-06 10:00:32.930521627 +0000 UTC m=+0.669463731 container died fb9b5b61d2ca2446a0d64f5cf1f50538f7d4513378afdaeab81ced8fe95510ba (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5, name=nova_compute, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm)
Dec 06 10:00:32 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fb9b5b61d2ca2446a0d64f5cf1f50538f7d4513378afdaeab81ced8fe95510ba-userdata-shm.mount: Deactivated successfully.
Dec 06 10:00:32 compute-1 systemd[1]: var-lib-containers-storage-overlay-fe42b0bad23ec200123940026b03e1add6e1efb6ab3f7ea1cab6155db02ec0c3-merged.mount: Deactivated successfully.
Dec 06 10:00:33 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:33 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4bc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:00:33 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:00:33 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:00:33 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:00:33.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:00:33 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:00:33 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:00:33 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:00:33.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:00:33 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:00:33 compute-1 podman[228522]: 2025-12-06 10:00:33.781511201 +0000 UTC m=+1.520453275 container cleanup fb9b5b61d2ca2446a0d64f5cf1f50538f7d4513378afdaeab81ced8fe95510ba (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5, name=nova_compute, tcib_managed=true, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 06 10:00:33 compute-1 podman[228522]: nova_compute
Dec 06 10:00:33 compute-1 podman[228547]: nova_compute
Dec 06 10:00:33 compute-1 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Dec 06 10:00:33 compute-1 systemd[1]: Stopped nova_compute container.
Dec 06 10:00:33 compute-1 systemd[1]: Starting nova_compute container...
Dec 06 10:00:33 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:33 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4dc0029b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:00:33 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:33 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4c40041f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:00:33 compute-1 systemd[1]: Started libcrun container.
Dec 06 10:00:33 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fe42b0bad23ec200123940026b03e1add6e1efb6ab3f7ea1cab6155db02ec0c3/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Dec 06 10:00:33 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fe42b0bad23ec200123940026b03e1add6e1efb6ab3f7ea1cab6155db02ec0c3/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec 06 10:00:33 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fe42b0bad23ec200123940026b03e1add6e1efb6ab3f7ea1cab6155db02ec0c3/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 06 10:00:33 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fe42b0bad23ec200123940026b03e1add6e1efb6ab3f7ea1cab6155db02ec0c3/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec 06 10:00:33 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fe42b0bad23ec200123940026b03e1add6e1efb6ab3f7ea1cab6155db02ec0c3/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 06 10:00:33 compute-1 podman[228560]: 2025-12-06 10:00:33.991919396 +0000 UTC m=+0.108064673 container init fb9b5b61d2ca2446a0d64f5cf1f50538f7d4513378afdaeab81ced8fe95510ba (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5, name=nova_compute, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=edpm)
Dec 06 10:00:34 compute-1 podman[228560]: 2025-12-06 10:00:34.005508885 +0000 UTC m=+0.121654112 container start fb9b5b61d2ca2446a0d64f5cf1f50538f7d4513378afdaeab81ced8fe95510ba (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5, name=nova_compute, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec 06 10:00:34 compute-1 podman[228560]: nova_compute
Dec 06 10:00:34 compute-1 nova_compute[228576]: + sudo -E kolla_set_configs
Dec 06 10:00:34 compute-1 systemd[1]: Started nova_compute container.
Dec 06 10:00:34 compute-1 ceph-mon[79770]: pgmap v571: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:00:34 compute-1 sudo[228496]: pam_unix(sudo:session): session closed for user root
Dec 06 10:00:34 compute-1 nova_compute[228576]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 06 10:00:34 compute-1 nova_compute[228576]: INFO:__main__:Validating config file
Dec 06 10:00:34 compute-1 nova_compute[228576]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 06 10:00:34 compute-1 nova_compute[228576]: INFO:__main__:Copying service configuration files
Dec 06 10:00:34 compute-1 nova_compute[228576]: INFO:__main__:Deleting /etc/nova/nova.conf
Dec 06 10:00:34 compute-1 nova_compute[228576]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Dec 06 10:00:34 compute-1 nova_compute[228576]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Dec 06 10:00:34 compute-1 nova_compute[228576]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Dec 06 10:00:34 compute-1 nova_compute[228576]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Dec 06 10:00:34 compute-1 nova_compute[228576]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Dec 06 10:00:34 compute-1 nova_compute[228576]: INFO:__main__:Deleting /etc/nova/nova.conf.d/03-ceph-nova.conf
Dec 06 10:00:34 compute-1 nova_compute[228576]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Dec 06 10:00:34 compute-1 nova_compute[228576]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Dec 06 10:00:34 compute-1 nova_compute[228576]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Dec 06 10:00:34 compute-1 nova_compute[228576]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Dec 06 10:00:34 compute-1 nova_compute[228576]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Dec 06 10:00:34 compute-1 nova_compute[228576]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Dec 06 10:00:34 compute-1 nova_compute[228576]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Dec 06 10:00:34 compute-1 nova_compute[228576]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Dec 06 10:00:34 compute-1 nova_compute[228576]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 06 10:00:34 compute-1 nova_compute[228576]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 06 10:00:34 compute-1 nova_compute[228576]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 06 10:00:34 compute-1 nova_compute[228576]: INFO:__main__:Deleting /etc/ceph
Dec 06 10:00:34 compute-1 nova_compute[228576]: INFO:__main__:Creating directory /etc/ceph
Dec 06 10:00:34 compute-1 nova_compute[228576]: INFO:__main__:Setting permission for /etc/ceph
Dec 06 10:00:34 compute-1 nova_compute[228576]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Dec 06 10:00:34 compute-1 nova_compute[228576]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Dec 06 10:00:34 compute-1 nova_compute[228576]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Dec 06 10:00:34 compute-1 nova_compute[228576]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Dec 06 10:00:34 compute-1 nova_compute[228576]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Dec 06 10:00:34 compute-1 nova_compute[228576]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Dec 06 10:00:34 compute-1 nova_compute[228576]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec 06 10:00:34 compute-1 nova_compute[228576]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Dec 06 10:00:34 compute-1 nova_compute[228576]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Dec 06 10:00:34 compute-1 nova_compute[228576]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec 06 10:00:34 compute-1 nova_compute[228576]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Dec 06 10:00:34 compute-1 nova_compute[228576]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Dec 06 10:00:34 compute-1 nova_compute[228576]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Dec 06 10:00:34 compute-1 nova_compute[228576]: INFO:__main__:Writing out command to execute
Dec 06 10:00:34 compute-1 nova_compute[228576]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Dec 06 10:00:34 compute-1 nova_compute[228576]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Dec 06 10:00:34 compute-1 nova_compute[228576]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Dec 06 10:00:34 compute-1 nova_compute[228576]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec 06 10:00:34 compute-1 nova_compute[228576]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec 06 10:00:34 compute-1 nova_compute[228576]: ++ cat /run_command
Dec 06 10:00:34 compute-1 nova_compute[228576]: + CMD=nova-compute
Dec 06 10:00:34 compute-1 nova_compute[228576]: + ARGS=
Dec 06 10:00:34 compute-1 nova_compute[228576]: + sudo kolla_copy_cacerts
Dec 06 10:00:34 compute-1 nova_compute[228576]: + [[ ! -n '' ]]
Dec 06 10:00:34 compute-1 nova_compute[228576]: + . kolla_extend_start
Dec 06 10:00:34 compute-1 nova_compute[228576]: Running command: 'nova-compute'
Dec 06 10:00:34 compute-1 nova_compute[228576]: + echo 'Running command: '\''nova-compute'\'''
Dec 06 10:00:34 compute-1 nova_compute[228576]: + umask 0022
Dec 06 10:00:34 compute-1 nova_compute[228576]: + exec nova-compute
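[editor's note] The kolla_set_configs output above (delete, copy, set permission, for each file listed in config.json) is the COPY_ALWAYS strategy the log announces, finishing with the /run_command file that the shell trace reads before `exec nova-compute`. A minimal sketch of that copy-always behavior, assuming a simplified config.json that only lists source/dest/perm; the real tool also handles directories, ownership, and symlinked trees:

    import json
    import os
    import shutil

    # Simplified stand-in for /var/lib/kolla/config_files/config.json
    # (assumed shape: the real schema carries more fields, e.g. owner).
    config = json.loads('''{
      "config_files": [
        {"source": "/var/lib/kolla/config_files/01-nova.conf",
         "dest": "/etc/nova/nova.conf.d/01-nova.conf",
         "perm": "0600"}
      ]
    }''')

    for f in config['config_files']:
        # COPY_ALWAYS: unconditionally replace the destination on every start.
        if os.path.exists(f['dest']):
            os.remove(f['dest'])                  # "Deleting ..."
        shutil.copy2(f['source'], f['dest'])      # "Copying ..."
        os.chmod(f['dest'], int(f['perm'], 8))    # "Setting permission ..."
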
Dec 06 10:00:34 compute-1 sudo[228738]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xvicblbbioqmhlfvtoqvmzamhphjxswe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765015234.508352-4385-148815317428775/AnsiballZ_podman_container.py'
Dec 06 10:00:34 compute-1 sudo[228738]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 10:00:34 compute-1 podman[228740]: 2025-12-06 10:00:34.973131488 +0000 UTC m=+0.090427744 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, config_id=multipathd)
Dec 06 10:00:35 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:35 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4b8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:00:35 compute-1 ceph-mon[79770]: pgmap v572: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Dec 06 10:00:35 compute-1 python3.9[228741]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Dec 06 10:00:35 compute-1 systemd[1]: Started libpod-conmon-5870135d94bf035757c1e7c1605be69377e8a248cd872b4fc5e7fa9268e794c2.scope.
Dec 06 10:00:35 compute-1 systemd[1]: Started libcrun container.
Dec 06 10:00:35 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/69877ef447b98042ebc90a4e4357f2beb92c0f4ec7b168baeca8fa88d64b6a81/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Dec 06 10:00:35 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/69877ef447b98042ebc90a4e4357f2beb92c0f4ec7b168baeca8fa88d64b6a81/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 06 10:00:35 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/69877ef447b98042ebc90a4e4357f2beb92c0f4ec7b168baeca8fa88d64b6a81/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Dec 06 10:00:35 compute-1 podman[228782]: 2025-12-06 10:00:35.410414495 +0000 UTC m=+0.135462066 container init 5870135d94bf035757c1e7c1605be69377e8a248cd872b4fc5e7fa9268e794c2 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5, name=nova_compute_init, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']})
Dec 06 10:00:35 compute-1 podman[228782]: 2025-12-06 10:00:35.419585618 +0000 UTC m=+0.144633199 container start 5870135d94bf035757c1e7c1605be69377e8a248cd872b4fc5e7fa9268e794c2 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5, name=nova_compute_init, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=edpm, container_name=nova_compute_init, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:00:35 compute-1 python3.9[228741]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Dec 06 10:00:35 compute-1 nova_compute_init[228804]: INFO:nova_statedir:Applying nova statedir ownership
Dec 06 10:00:35 compute-1 nova_compute_init[228804]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Dec 06 10:00:35 compute-1 nova_compute_init[228804]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Dec 06 10:00:35 compute-1 nova_compute_init[228804]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Dec 06 10:00:35 compute-1 nova_compute_init[228804]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Dec 06 10:00:35 compute-1 nova_compute_init[228804]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Dec 06 10:00:35 compute-1 nova_compute_init[228804]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Dec 06 10:00:35 compute-1 nova_compute_init[228804]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Dec 06 10:00:35 compute-1 nova_compute_init[228804]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Dec 06 10:00:35 compute-1 nova_compute_init[228804]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Dec 06 10:00:35 compute-1 nova_compute_init[228804]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Dec 06 10:00:35 compute-1 nova_compute_init[228804]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Dec 06 10:00:35 compute-1 nova_compute_init[228804]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Dec 06 10:00:35 compute-1 nova_compute_init[228804]: INFO:nova_statedir:Nova statedir ownership complete
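[editor's note] The nova_compute_init container walks /var/lib/nova and normalizes ownership to the in-container nova uid/gid (42436:42436, per the "Target ownership" line), skipping the path named in NOVA_STATEDIR_OWNERSHIP_SKIP. A minimal sketch of that walk under those assumptions; the real script also restores SELinux contexts, which this sketch omits:

    import os

    TARGET_UID = TARGET_GID = 42436   # in-container 'nova' uid/gid, from the log
    SKIP = os.environ.get('NOVA_STATEDIR_OWNERSHIP_SKIP',
                          '/var/lib/nova/compute_id')


    def fix_ownership(root='/var/lib/nova'):
        for dirpath, dirnames, filenames in os.walk(root):
            for path in [dirpath] + [os.path.join(dirpath, f)
                                     for f in filenames]:
                if path == SKIP:
                    continue                      # leave compute_id untouched
                st = os.lstat(path)
                if (st.st_uid, st.st_gid) != (TARGET_UID, TARGET_GID):
                    os.lchown(path, TARGET_UID, TARGET_GID)
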
Dec 06 10:00:35 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:00:35 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:00:35 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:00:35.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:00:35 compute-1 systemd[1]: libpod-5870135d94bf035757c1e7c1605be69377e8a248cd872b4fc5e7fa9268e794c2.scope: Deactivated successfully.
Dec 06 10:00:35 compute-1 podman[228805]: 2025-12-06 10:00:35.508951036 +0000 UTC m=+0.044594303 container died 5870135d94bf035757c1e7c1605be69377e8a248cd872b4fc5e7fa9268e794c2 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5, name=nova_compute_init, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, container_name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:00:35 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5870135d94bf035757c1e7c1605be69377e8a248cd872b4fc5e7fa9268e794c2-userdata-shm.mount: Deactivated successfully.
Dec 06 10:00:35 compute-1 systemd[1]: var-lib-containers-storage-overlay-69877ef447b98042ebc90a4e4357f2beb92c0f4ec7b168baeca8fa88d64b6a81-merged.mount: Deactivated successfully.
Dec 06 10:00:35 compute-1 podman[228816]: 2025-12-06 10:00:35.585137544 +0000 UTC m=+0.067513339 container cleanup 5870135d94bf035757c1e7c1605be69377e8a248cd872b4fc5e7fa9268e794c2 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5, name=nova_compute_init, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=nova_compute_init, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:00:35 compute-1 systemd[1]: libpod-conmon-5870135d94bf035757c1e7c1605be69377e8a248cd872b4fc5e7fa9268e794c2.scope: Deactivated successfully.
Dec 06 10:00:35 compute-1 sudo[228738]: pam_unix(sudo:session): session closed for user root
Dec 06 10:00:35 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:00:35 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:00:35 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:00:35.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:00:35 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:35 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4bc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:00:35 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:35 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4dc0029b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:00:36 compute-1 sshd-session[199382]: Connection closed by 192.168.122.30 port 40024
Dec 06 10:00:36 compute-1 sshd-session[199379]: pam_unix(sshd:session): session closed for user zuul
Dec 06 10:00:36 compute-1 systemd[1]: session-53.scope: Deactivated successfully.
Dec 06 10:00:36 compute-1 systemd[1]: session-53.scope: Consumed 2min 25.291s CPU time.
Dec 06 10:00:36 compute-1 systemd-logind[788]: Session 53 logged out. Waiting for processes to exit.
Dec 06 10:00:36 compute-1 systemd-logind[788]: Removed session 53.
Dec 06 10:00:36 compute-1 nova_compute[228576]: 2025-12-06 10:00:36.253 228580 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Dec 06 10:00:36 compute-1 nova_compute[228576]: 2025-12-06 10:00:36.254 228580 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Dec 06 10:00:36 compute-1 nova_compute[228576]: 2025-12-06 10:00:36.254 228580 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Dec 06 10:00:36 compute-1 nova_compute[228576]: 2025-12-06 10:00:36.254 228580 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Dec 06 10:00:36 compute-1 nova_compute[228576]: 2025-12-06 10:00:36.442 228580 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:00:36 compute-1 nova_compute[228576]: 2025-12-06 10:00:36.456 228580 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.015s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:00:36 compute-1 nova_compute[228576]: 2025-12-06 10:00:36.457 228580 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
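[editor's note] The failed `grep -F node.session.scan /sbin/iscsiadm` (exit status 1, "Not Retrying") is a feature probe: searching the iscsiadm binary for the node.session.scan string to decide whether manual iSCSI scans can be enabled; a non-zero exit simply means the string is absent, so the code falls back, as here. A minimal sketch of the same probe:

    import subprocess

    # Exit status 0 => the iscsiadm binary mentions node.session.scan,
    # so manual device scanning can be enabled; 1 => fall back (as here).
    res = subprocess.run(
        ['grep', '-F', 'node.session.scan', '/sbin/iscsiadm'],
        stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL,
    )
    supports_manual_scan = (res.returncode == 0)
    print('manual iSCSI scan supported:', supports_manual_scan)
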
Dec 06 10:00:36 compute-1 nova_compute[228576]: 2025-12-06 10:00:36.927 228580 INFO nova.virt.driver [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Dec 06 10:00:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:37 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4c40041f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.075 228580 INFO nova.compute.provider_config [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.084 228580 DEBUG oslo_concurrency.lockutils [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.084 228580 DEBUG oslo_concurrency.lockutils [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.084 228580 DEBUG oslo_concurrency.lockutils [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.085 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.085 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.085 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.085 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.085 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.085 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.086 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.086 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.086 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.086 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.086 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.086 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.087 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.087 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.087 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.087 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.087 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.087 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.087 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.088 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] console_host                   = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.088 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.088 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.088 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.088 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.088 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.088 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.089 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.089 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.089 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.089 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.089 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.089 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.090 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.090 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.090 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.090 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.090 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.090 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.091 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.091 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.091 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.091 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.091 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.091 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.091 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.092 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.092 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.092 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.092 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.092 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.092 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.093 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.093 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.093 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.093 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.093 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.093 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.093 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.094 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.094 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.094 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.094 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.094 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.094 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.094 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.095 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.095 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.095 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.095 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.095 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.095 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.095 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.096 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.096 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.096 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.096 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.096 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.097 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.097 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.097 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.097 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] my_block_storage_ip            = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.097 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] my_ip                          = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.097 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.098 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.098 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.098 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.098 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.098 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.098 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.099 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.099 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.099 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.099 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.099 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.099 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.100 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.100 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.100 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.100 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.100 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.101 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.101 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.101 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.101 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.101 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.101 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.101 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.102 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.102 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.102 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.102 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.102 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.102 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.103 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.103 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.103 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.103 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.103 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.103 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.104 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.104 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.104 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.104 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.104 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.104 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.104 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.105 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.105 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.105 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.105 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.105 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.105 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.105 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.106 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.106 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.106 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.106 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.106 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.106 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.107 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.107 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.107 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.107 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.107 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.107 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.107 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.108 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.108 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.108 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.108 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.108 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.109 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.109 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.109 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.109 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.109 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.109 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.109 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.110 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.110 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.110 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.110 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.110 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.110 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.110 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.111 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.111 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.111 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.111 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.111 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.111 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.112 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.112 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.112 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.112 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.112 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.112 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.113 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.113 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.113 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.113 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.113 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.114 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.114 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.114 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.114 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.114 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.114 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.114 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.115 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.115 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.115 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.115 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.115 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.115 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.115 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.116 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.116 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.116 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.116 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.116 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.116 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.117 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.117 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.117 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.117 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.117 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.117 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.117 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.118 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.118 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.118 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.118 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.118 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.118 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.119 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.119 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.119 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.119 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.119 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.120 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.120 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.120 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.120 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.120 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.121 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.121 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.121 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.121 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.121 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.121 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.121 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.122 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.122 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.122 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.122 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.122 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
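
[Annotation] Every line in this section is emitted by oslo.config's ConfigOpts.log_opt_values(), which nova-compute calls once at startup when debug logging is enabled; the trailing "log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609" is just the oslo.log formatter's function-name/source-location suffix. A minimal sketch of the same mechanism, reusing two of the [compute] options shown above (illustrative only; nova registers the real options through its own nova.conf modules):

    import logging
    from oslo_config import cfg

    logging.basicConfig(level=logging.DEBUG)
    LOG = logging.getLogger(__name__)

    conf = cfg.ConfigOpts()
    conf.register_opts(
        [cfg.IntOpt('shutdown_retry_interval', default=10),
         cfg.ListOpt('vmdk_allowed_types',
                     default=['streamOptimized', 'monolithicSparse'])],
        group='compute')
    conf(args=[], project='nova')

    # Emits one "compute.<option> = <value>" DEBUG line per registered
    # option -- the exact mechanism that produced this journal section.
    conf.log_opt_values(LOG, logging.DEBUG)
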
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.123 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.123 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.123 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.123 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.123 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.124 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.124 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.124 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.124 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.124 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.125 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.125 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.125 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.125 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.125 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.126 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.126 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.126 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.126 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.126 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.126 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.127 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.127 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.127 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
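
[Annotation] The [cyborg] block above is the standard keystoneauth1 option set nova registers for every service it talks to: session options (cafile, certfile, keyfile, insecure, timeout, collect_timing, split_loggers) plus adapter options (service_type, service_name, valid_interfaces, region_name, endpoint_override, connect/status-code retries, min/max version). The same shape recurs below for glance, ironic, and keystone. A sketch of how such a group is registered and consumed with keystoneauth1's loading helpers (generic keystoneauth1 usage, not nova's exact code path):

    from keystoneauth1 import loading as ks_loading
    from oslo_config import cfg

    conf = cfg.ConfigOpts()
    # Registers cafile/certfile/keyfile/insecure/timeout/... under [cyborg]:
    ks_loading.register_session_conf_options(conf, 'cyborg')
    # Registers service_type/valid_interfaces/region_name/*_retries/...:
    ks_loading.register_adapter_conf_options(conf, 'cyborg')
    conf(args=[], project='nova')

    session = ks_loading.load_session_from_conf_options(conf, 'cyborg')
    adapter = ks_loading.load_adapter_from_conf_options(
        conf, 'cyborg', session=session)
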
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.127 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.127 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.128 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.128 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.128 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.128 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.128 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.129 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.129 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.129 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.129 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.129 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.130 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.130 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.130 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.130 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.130 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.130 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.130 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.131 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
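
[Annotation] database.connection and database.slave_connection print as **** because they are registered with secret=True, so log_opt_values() masks them; the rest of the [database] block is oslo.db's standard engine tuning (pool sizing, retry/backoff intervals). A minimal sketch of that masking behaviour (hypothetical option set, same mechanism):

    import logging
    from oslo_config import cfg

    logging.basicConfig(level=logging.DEBUG)
    LOG = logging.getLogger(__name__)

    conf = cfg.ConfigOpts()
    conf.register_opts(
        [cfg.StrOpt('connection', secret=True),    # printed as ****
         cfg.IntOpt('max_pool_size', default=5)],  # printed in the clear
        group='database')
    conf(args=[], project='nova')
    conf.set_override('connection', 'mysql+pymysql://nova:pw@db/nova',
                      group='database')

    # Logs "database.connection = ****" -- secret=True hides the DSN.
    conf.log_opt_values(LOG, logging.DEBUG)
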
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.131 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.131 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.131 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.131 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.131 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.131 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.132 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.132 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.132 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.132 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.132 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.132 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.133 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.133 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.133 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.133 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.133 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.133 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.133 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.134 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.134 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.134 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.134 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.134 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.134 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.135 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.135 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.135 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.135 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.135 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.135 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.135 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.136 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.136 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.136 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.136 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.136 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.136 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.137 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.137 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.137 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.137 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.137 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.137 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.137 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.138 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.138 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.138 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.138 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.138 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.138 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.138 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.139 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.139 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
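
[Annotation] The [glance] group departs from defaults in three places visible above (region_name = regionOne, valid_interfaces = ['internal'], num_retries = 3), values that would normally come from nova.conf on this compute host. A round-trip sketch of how a config file feeds such options (option registration simplified; nova defines the real opts itself):

    import tempfile
    from oslo_config import cfg

    SAMPLE = ("[glance]\n"
              "region_name = regionOne\n"
              "valid_interfaces = internal\n"
              "num_retries = 3\n")

    conf = cfg.ConfigOpts()
    conf.register_opts(
        [cfg.StrOpt('region_name'),
         cfg.ListOpt('valid_interfaces', default=['internal', 'public']),
         cfg.IntOpt('num_retries', default=3)],
        group='glance')

    with tempfile.NamedTemporaryFile('w', suffix='.conf') as f:
        f.write(SAMPLE)
        f.flush()
        conf(args=[], project='nova', default_config_files=[f.name])

    assert conf.glance.region_name == 'regionOne'
    assert conf.glance.valid_interfaces == ['internal']
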
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.139 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.139 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.139 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.139 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.139 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.140 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.140 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.140 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.140 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.140 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.140 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.141 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.141 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.141 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.141 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.141 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.141 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.141 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.142 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.142 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.142 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.142 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.142 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.143 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.143 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.143 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.143 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.143 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.143 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.143 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.144 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.144 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.144 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.144 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.144 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.144 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.145 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.145 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.145 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.145 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.145 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.145 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.145 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.146 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.146 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.146 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.146 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.146 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.146 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.146 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.147 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.147 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.147 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.147 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.147 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.147 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.148 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.148 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.148 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.148 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.149 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.149 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.149 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.149 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.149 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.149 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.149 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.150 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.150 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.150 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.150 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.150 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.150 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.150 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.151 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.151 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.151 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.151 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.151 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.151 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.151 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.152 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.152 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.152 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.152 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.152 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.152 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.152 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.153 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.153 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.153 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.153 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.153 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.153 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.153 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.154 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.154 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
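
[Annotation] key_manager.backend = barbican selects which castellan key manager the [barbican] block configures (fixed_key is masked as a secret, like the database DSNs above); the [vault] group is castellan's alternative backend, registered and logged even though it is not selected here. A hedged sketch of backend selection through castellan's public API (call shape only; request-context handling omitted):

    from castellan import key_manager
    from oslo_config import cfg

    # nova passes its own config; [key_manager] backend = barbican picks
    # the barbican implementation, configured by the [barbican] options.
    km = key_manager.API(cfg.CONF)
    # Secrets are then fetched per request context, e.g.:
    #   secret = km.get(context, managed_object_id)
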
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.154 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.154 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.154 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.154 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.154 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.155 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.155 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.155 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.155 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.155 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.155 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.155 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.156 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.156 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.156 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.156 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.156 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.156 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.156 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
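Every line above and below follows the same shape: at startup, nova-compute asks oslo.config to dump the effective value of every registered option, group by group, via `ConfigOpts.log_opt_values()` (the trailing `log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609` names the emitting function and its source location). A minimal, self-contained sketch of that mechanism, with illustrative option names rather than nova's real schema:

```python
# Minimal sketch of the oslo.config pattern behind these journal lines.
# The option names below are illustrative, not nova's actual schema.
import logging

from oslo_config import cfg

logging.basicConfig(level=logging.DEBUG)
LOG = logging.getLogger(__name__)

CONF = cfg.CONF
CONF.register_opts(
    [
        cfg.StrOpt('service_type', default='identity'),
        cfg.BoolOpt('insecure', default=False),
    ],
    group='keystone',
)

# Parse CLI args / config files (empty argv here), then log every
# registered option at DEBUG as "group.option = value" lines.
CONF([], project='sketch')
CONF.log_opt_values(LOG, logging.DEBUG)
```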
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.157 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.157 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.157 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.157 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.157 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.157 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.157 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.158 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.158 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.158 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.158 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.158 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.158 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.158 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.159 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.159 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.159 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.159 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.159 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.159 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.160 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.160 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.160 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.160 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.160 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.160 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.161 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.161 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.161 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.161 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.161 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.161 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.162 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.162 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.162 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.162 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.162 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.162 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.163 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.163 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.163 228580 WARNING oslo_config.cfg [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Dec 06 10:00:37 compute-1 nova_compute[228576]: live_migration_uri is deprecated for removal in favor of two other options that
Dec 06 10:00:37 compute-1 nova_compute[228576]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Dec 06 10:00:37 compute-1 nova_compute[228576]: and ``live_migration_inbound_addr`` respectively.
Dec 06 10:00:37 compute-1 nova_compute[228576]: ).  Its value may be silently ignored in the future.
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.163 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
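The warning above is oslo.config's standard deprecation notice: `live_migration_uri` still works (and is set here to `qemu+tls://%s/system`, where `%s` receives the target host), but the supported way to express the same thing is `live_migration_scheme` plus `live_migration_inbound_addr`. A hedged reconstruction of that precedence, mirroring the documented intent rather than nova's verbatim code:

```python
# Plausible sketch of the option precedence the warning describes;
# not nova's actual implementation. dest_host stands in for the
# target's live_migration_inbound_addr (or its hostname when unset,
# as it is in this log).
def migration_uri(dest_host,
                  live_migration_uri='qemu+tls://%s/system',
                  live_migration_scheme=None):
    if live_migration_uri:
        # Deprecated template path: %s receives the target host.
        return live_migration_uri % dest_host
    # Replacement path: scheme falls back to plain tcp when unset.
    return 'qemu+%s://%s/system' % (live_migration_scheme or 'tcp',
                                    dest_host)

# With the values logged here: qemu+tls://compute-2/system
print(migration_uri('compute-2'))
```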
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.163 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.164 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.164 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.164 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.164 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.164 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.164 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.165 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.165 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.165 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.165 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.165 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.165 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.165 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.166 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.166 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.166 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.166 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.rbd_secret_uuid        = 5ecd3f74-dade-5fc4-92ce-8950ae424258 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.166 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.166 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.167 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.167 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.167 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.167 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.167 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.167 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.167 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.168 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.168 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.168 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.168 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.168 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.168 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.169 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.169 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.169 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.169 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.169 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.169 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.169 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.170 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.170 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.170 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.170 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.170 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.170 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.171 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.171 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.171 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.171 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.171 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.171 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.172 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.172 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.172 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.172 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.172 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.172 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.173 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.173 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.173 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.173 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.173 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.173 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.173 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.174 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.174 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.174 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.174 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.174 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.174 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.174 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.175 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.175 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.175 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.175 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.175 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.176 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.176 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.176 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.176 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.176 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.177 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.177 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.177 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.177 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.177 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.177 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.177 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.178 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.178 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.178 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.178 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.178 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.178 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.179 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.179 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.179 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.179 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.179 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.179 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.179 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.180 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.180 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.180 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.180 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.180 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.180 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.180 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.181 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.181 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.181 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.181 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.181 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.181 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.181 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.182 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.182 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.182 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.182 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.182 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.182 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.182 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.183 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
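Two values in this dump, `neutron.metadata_proxy_shared_secret` and `placement.password`, are logged as `****`: oslo.config masks any option registered with `secret=True`, so credentials never reach the journal even at DEBUG. Illustrated with the same sketch pattern as above (the group and option name are chosen to match the log, not taken from nova's code):

```python
import logging

from oslo_config import cfg

logging.basicConfig(level=logging.DEBUG)
LOG = logging.getLogger(__name__)

CONF = cfg.CONF
# secret=True is what turns the logged value into "****".
CONF.register_opts(
    [cfg.StrOpt('password', secret=True, default='not-for-the-journal')],
    group='placement',
)
CONF([], project='sketch')
CONF.log_opt_values(LOG, logging.DEBUG)  # -> "placement.password = ****"
```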
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.183 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.183 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.183 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.183 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.183 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.183 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.184 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.184 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.184 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.184 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.184 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.184 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.185 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.185 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.185 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.185 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.185 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.185 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.186 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.186 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.186 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.186 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.186 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.186 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.186 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.187 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.187 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.187 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.187 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.187 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.187 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.187 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.188 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.188 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.188 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.188 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.188 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.188 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.188 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.189 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.189 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.189 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.189 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.189 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.189 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.189 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.190 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.190 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.190 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.190 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.190 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.190 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.191 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.191 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.191 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.191 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.191 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.191 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.192 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.192 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.192 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.192 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.192 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.192 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.192 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.193 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.193 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.193 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.193 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.193 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.193 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.193 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.194 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.194 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.194 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.194 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.194 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.194 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.195 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.195 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.195 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.195 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.195 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.195 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.195 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.196 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.196 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.196 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.196 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.196 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.196 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.196 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.197 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.197 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.197 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.197 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.197 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.197 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.198 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.198 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.198 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.198 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.198 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.198 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.199 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.199 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.199 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.199 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.199 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.200 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.200 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.200 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.200 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.200 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.200 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.201 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.201 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.201 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.201 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.201 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.202 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.202 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.202 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.202 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.203 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.203 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.203 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.203 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vnc.server_proxyclient_address = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.203 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.203 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.204 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.204 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.204 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.204 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.204 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.204 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.205 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.205 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.205 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.205 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.205 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.206 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.206 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.206 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.206 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.206 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.206 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.207 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.207 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.207 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.207 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.207 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.207 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.207 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.208 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.208 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.208 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.208 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.208 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.208 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.209 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.209 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.209 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.209 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.209 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.209 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.209 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.210 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.210 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.210 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.210 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.210 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.210 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.211 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.211 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.211 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.211 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.211 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.212 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.212 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.212 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.212 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.212 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.212 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.213 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.213 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.213 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.213 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.213 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.213 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.214 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.214 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.214 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.214 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.214 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.214 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.214 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.215 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.215 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.215 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.215 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.215 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.215 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.215 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.216 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.216 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.216 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.216 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.216 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.216 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.216 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.217 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.217 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.217 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.217 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.217 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.217 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.218 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.218 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.218 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.218 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.218 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.219 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.219 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.219 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.219 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.219 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.219 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.219 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.220 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.220 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.220 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.220 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.220 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.220 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.221 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.221 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.221 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.221 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.221 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.221 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.221 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.222 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.222 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.222 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.222 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.222 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.222 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.222 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.223 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.223 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.223 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.223 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.223 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.223 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.223 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.224 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.224 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.224 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.224 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.224 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.224 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.225 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.225 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.225 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.225 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.225 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.225 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.225 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.226 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.226 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.226 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.226 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.226 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.226 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.227 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.227 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.227 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.227 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.227 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.227 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.227 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.228 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.228 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.228 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.228 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.228 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.228 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.228 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.229 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.229 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.229 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.229 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.229 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.229 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.229 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.230 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.230 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.230 228580 DEBUG oslo_service.service [None req-5a900ed8-9800-4ed4-b980-15913834d6e3 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
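[editorial note] The block above is oslo.config dumping every effective option at DEBUG level through ConfigOpts.log_opt_values(); options registered with secret=True (passwords, transport URLs) are masked as ****, and the row of asterisks on the last line is the footer emitted by that same call. A minimal sketch of how an oslo.config-based service produces such a dump — the option names below are illustrative, not nova's real registration:

```python
import logging

from oslo_config import cfg

CONF = cfg.CONF
CONF.register_opts(
    [
        cfg.StrOpt('username', default='nova'),
        cfg.StrOpt('password', secret=True),  # rendered as **** in the dump
        cfg.StrOpt('auth_url'),
    ],
    group='oslo_limit',
)

logging.basicConfig(level=logging.DEBUG)
LOG = logging.getLogger('oslo_service.service')

CONF([])                                  # parse an (empty) command line
CONF.log_opt_values(LOG, logging.DEBUG)   # one "group.option = value" line each
```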
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.231 228580 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.260 228580 INFO nova.virt.node [None req-631af40b-fc70-41ee-854f-2584c3d4f164 - - - - - -] Determined node identity ff2f17cb-ff1d-4da7-9560-4be741380cb1 from /var/lib/nova/compute_id
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.261 228580 DEBUG nova.virt.libvirt.host [None req-631af40b-fc70-41ee-854f-2584c3d4f164 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.262 228580 DEBUG nova.virt.libvirt.host [None req-631af40b-fc70-41ee-854f-2584c3d4f164 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.262 228580 DEBUG nova.virt.libvirt.host [None req-631af40b-fc70-41ee-854f-2584c3d4f164 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.263 228580 DEBUG nova.virt.libvirt.host [None req-631af40b-fc70-41ee-854f-2584c3d4f164 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.278 228580 DEBUG nova.virt.libvirt.host [None req-631af40b-fc70-41ee-854f-2584c3d4f164 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7fb57c47a340> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.281 228580 DEBUG nova.virt.libvirt.host [None req-631af40b-fc70-41ee-854f-2584c3d4f164 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7fb57c47a340> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.282 228580 INFO nova.virt.libvirt.driver [None req-631af40b-fc70-41ee-854f-2584c3d4f164 - - - - - -] Connection event '1' reason 'None'
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.289 228580 INFO nova.virt.libvirt.host [None req-631af40b-fc70-41ee-854f-2584c3d4f164 - - - - - -] Libvirt host capabilities <capabilities>
Dec 06 10:00:37 compute-1 nova_compute[228576]: 
Dec 06 10:00:37 compute-1 nova_compute[228576]:   <host>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <uuid>9a5f3f62-e1ed-4c63-8d00-a3c5e56bbddc</uuid>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <cpu>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <arch>x86_64</arch>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model>EPYC-Rome-v4</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <vendor>AMD</vendor>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <microcode version='16777317'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <signature family='23' model='49' stepping='0'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <maxphysaddr mode='emulate' bits='40'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature name='x2apic'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature name='tsc-deadline'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature name='osxsave'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature name='hypervisor'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature name='tsc_adjust'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature name='spec-ctrl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature name='stibp'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature name='arch-capabilities'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature name='ssbd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature name='cmp_legacy'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature name='topoext'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature name='virt-ssbd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature name='lbrv'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature name='tsc-scale'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature name='vmcb-clean'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature name='pause-filter'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature name='pfthreshold'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature name='svme-addr-chk'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature name='rdctl-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature name='skip-l1dfl-vmentry'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature name='mds-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature name='pschange-mc-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <pages unit='KiB' size='4'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <pages unit='KiB' size='2048'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <pages unit='KiB' size='1048576'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </cpu>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <power_management>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <suspend_mem/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </power_management>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <iommu support='no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <migration_features>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <live/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <uri_transports>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <uri_transport>tcp</uri_transport>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <uri_transport>rdma</uri_transport>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </uri_transports>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </migration_features>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <topology>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <cells num='1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <cell id='0'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:           <memory unit='KiB'>7864312</memory>
Dec 06 10:00:37 compute-1 nova_compute[228576]:           <pages unit='KiB' size='4'>1966078</pages>
Dec 06 10:00:37 compute-1 nova_compute[228576]:           <pages unit='KiB' size='2048'>0</pages>
Dec 06 10:00:37 compute-1 nova_compute[228576]:           <pages unit='KiB' size='1048576'>0</pages>
Dec 06 10:00:37 compute-1 nova_compute[228576]:           <distances>
Dec 06 10:00:37 compute-1 nova_compute[228576]:             <sibling id='0' value='10'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:           </distances>
Dec 06 10:00:37 compute-1 nova_compute[228576]:           <cpus num='8'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:           </cpus>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         </cell>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </cells>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </topology>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <cache>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </cache>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <secmodel>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model>selinux</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <doi>0</doi>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </secmodel>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <secmodel>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model>dac</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <doi>0</doi>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <baselabel type='kvm'>+107:+107</baselabel>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <baselabel type='qemu'>+107:+107</baselabel>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </secmodel>
Dec 06 10:00:37 compute-1 nova_compute[228576]:   </host>
Dec 06 10:00:37 compute-1 nova_compute[228576]: 
Dec 06 10:00:37 compute-1 nova_compute[228576]:   <guest>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <os_type>hvm</os_type>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <arch name='i686'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <wordsize>32</wordsize>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <domain type='qemu'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <domain type='kvm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </arch>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <features>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <pae/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <nonpae/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <acpi default='on' toggle='yes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <apic default='on' toggle='no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <cpuselection/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <deviceboot/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <disksnapshot default='on' toggle='no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <externalSnapshot/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </features>
Dec 06 10:00:37 compute-1 nova_compute[228576]:   </guest>
Dec 06 10:00:37 compute-1 nova_compute[228576]: 
Dec 06 10:00:37 compute-1 nova_compute[228576]:   <guest>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <os_type>hvm</os_type>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <arch name='x86_64'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <wordsize>64</wordsize>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <domain type='qemu'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <domain type='kvm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </arch>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <features>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <acpi default='on' toggle='yes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <apic default='on' toggle='no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <cpuselection/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <deviceboot/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <disksnapshot default='on' toggle='no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <externalSnapshot/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </features>
Dec 06 10:00:37 compute-1 nova_compute[228576]:   </guest>
Dec 06 10:00:37 compute-1 nova_compute[228576]: 
Dec 06 10:00:37 compute-1 nova_compute[228576]: </capabilities>
Dec 06 10:00:37 compute-1 nova_compute[228576]: 
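[editorial note] The capabilities document nova just logged is the verbatim return value of libvirt's virConnectGetCapabilities() against qemu:///system. A minimal sketch, assuming the libvirt-python bindings are installed and the qemu:///system socket is reachable, that fetches the same XML and pulls out two of the fields visible above (host CPU model and per-NUMA-cell memory):

```python
import xml.etree.ElementTree as ET

import libvirt  # assumes libvirt-python is installed

conn = libvirt.open('qemu:///system')
caps = ET.fromstring(conn.getCapabilities())  # same XML as logged above

print('cpu model:', caps.findtext('host/cpu/model'))   # EPYC-Rome-v4 here
for cell in caps.findall('host/topology/cells/cell'):
    print(f"cell {cell.get('id')}: {cell.findtext('memory')} KiB")  # 7864312 KiB
conn.close()
```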
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.296 228580 DEBUG nova.virt.libvirt.volume.mount [None req-631af40b-fc70-41ee-854f-2584c3d4f164 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.297 228580 DEBUG nova.virt.libvirt.host [None req-631af40b-fc70-41ee-854f-2584c3d4f164 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
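[editorial note] For each guest architecture and canonical machine type, nova now queries libvirt for the per-machine domain capabilities; the XML dumped next for arch=i686, machine_type=q35 is the result of that query. A sketch of fetching the same document directly, assuming a local qemu:///system connection (roughly equivalent to `virsh domcapabilities --arch i686 --machine q35 --virttype kvm`):

```python
import libvirt

conn = libvirt.open('qemu:///system')
# virConnectGetDomainCapabilities(emulatorbin, arch, machine, virttype, flags)
xml = conn.getDomainCapabilities(None, 'i686', 'q35', 'kvm', 0)
print(xml.splitlines()[0])   # '<domainCapabilities>' as dumped below
conn.close()
```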
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.302 228580 DEBUG nova.virt.libvirt.host [None req-631af40b-fc70-41ee-854f-2584c3d4f164 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Dec 06 10:00:37 compute-1 nova_compute[228576]: <domainCapabilities>
Dec 06 10:00:37 compute-1 nova_compute[228576]:   <path>/usr/libexec/qemu-kvm</path>
Dec 06 10:00:37 compute-1 nova_compute[228576]:   <domain>kvm</domain>
Dec 06 10:00:37 compute-1 nova_compute[228576]:   <machine>pc-q35-rhel9.8.0</machine>
Dec 06 10:00:37 compute-1 nova_compute[228576]:   <arch>i686</arch>
Dec 06 10:00:37 compute-1 nova_compute[228576]:   <vcpu max='4096'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:   <iothreads supported='yes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:   <os supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <enum name='firmware'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <loader supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='type'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>rom</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>pflash</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='readonly'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>yes</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>no</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='secure'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>no</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </loader>
Dec 06 10:00:37 compute-1 nova_compute[228576]:   </os>
Dec 06 10:00:37 compute-1 nova_compute[228576]:   <cpu>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <mode name='host-passthrough' supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='hostPassthroughMigratable'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>on</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>off</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </mode>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <mode name='maximum' supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='maximumMigratable'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>on</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>off</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </mode>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <mode name='host-model' supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <vendor>AMD</vendor>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='x2apic'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='tsc-deadline'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='hypervisor'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='tsc_adjust'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='spec-ctrl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='stibp'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='ssbd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='cmp_legacy'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='overflow-recov'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='succor'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='ibrs'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='amd-ssbd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='virt-ssbd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='lbrv'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='tsc-scale'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='vmcb-clean'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='flushbyasid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='pause-filter'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='pfthreshold'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='svme-addr-chk'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='disable' name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </mode>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <mode name='custom' supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Broadwell'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Broadwell-IBRS'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Broadwell-noTSX'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Broadwell-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Broadwell-v2'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Broadwell-v3'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Broadwell-v4'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Cascadelake-Server'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Cascadelake-Server-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Cascadelake-Server-v2'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Cascadelake-Server-v3'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Cascadelake-Server-v4'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Cascadelake-Server-v5'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Cooperlake'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-bf16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='taa-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Cooperlake-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-bf16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='taa-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Cooperlake-v2'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-bf16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='taa-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Denverton'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='mpx'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Denverton-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='mpx'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Denverton-v2'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Denverton-v3'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Dhyana-v2'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='EPYC-Genoa'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amd-psfd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='auto-ibrs'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-bf16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bitalg'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512ifma'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='la57'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='no-nested-data-bp'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='null-sel-clr-base'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='stibp-always-on'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='EPYC-Genoa-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amd-psfd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='auto-ibrs'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-bf16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bitalg'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512ifma'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='la57'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='no-nested-data-bp'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='null-sel-clr-base'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='stibp-always-on'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='EPYC-Milan'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='EPYC-Milan-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='EPYC-Milan-v2'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amd-psfd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='no-nested-data-bp'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='null-sel-clr-base'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='stibp-always-on'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='EPYC-Rome'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='EPYC-Rome-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='EPYC-Rome-v2'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='EPYC-Rome-v3'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='EPYC-v3'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='EPYC-v4'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='GraniteRapids'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-bf16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-fp16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-int8'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-tile'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx-vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-bf16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-fp16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bitalg'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512ifma'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='bus-lock-detect'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fbsdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrc'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrs'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fzrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='la57'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='mcdt-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pbrsb-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='prefetchiti'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='psdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='sbdr-ssdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='serialize'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='taa-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='tsx-ldtrk'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xfd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='GraniteRapids-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-bf16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-fp16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-int8'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-tile'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx-vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-bf16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-fp16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bitalg'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512ifma'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='bus-lock-detect'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fbsdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrc'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrs'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fzrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='la57'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='mcdt-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pbrsb-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='prefetchiti'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='psdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='sbdr-ssdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='serialize'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='taa-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='tsx-ldtrk'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xfd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='GraniteRapids-v2'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-bf16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-fp16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-int8'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-tile'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx-vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx10'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx10-128'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx10-256'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx10-512'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-bf16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-fp16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bitalg'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512ifma'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='bus-lock-detect'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='cldemote'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fbsdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrc'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrs'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fzrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='la57'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='mcdt-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='movdir64b'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='movdiri'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pbrsb-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='prefetchiti'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='psdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='sbdr-ssdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='serialize'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ss'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='taa-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='tsx-ldtrk'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xfd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Haswell'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Haswell-IBRS'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Haswell-noTSX'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Haswell-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Haswell-v2'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Haswell-v3'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Haswell-v4'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Icelake-Server'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bitalg'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='la57'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Icelake-Server-noTSX'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bitalg'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='la57'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Icelake-Server-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bitalg'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='la57'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Icelake-Server-v2'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bitalg'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='la57'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Icelake-Server-v3'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bitalg'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='la57'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='taa-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Icelake-Server-v4'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bitalg'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512ifma'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='la57'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='taa-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Icelake-Server-v5'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bitalg'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512ifma'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='la57'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='taa-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Icelake-Server-v6'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bitalg'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512ifma'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='la57'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='taa-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Icelake-Server-v7'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bitalg'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512ifma'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='la57'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='taa-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='IvyBridge'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='IvyBridge-IBRS'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='IvyBridge-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='IvyBridge-v2'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='KnightsMill'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-4fmaps'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-4vnniw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512er'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512pf'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ss'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='KnightsMill-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-4fmaps'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-4vnniw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512er'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512pf'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ss'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Opteron_G4'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fma4'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xop'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Opteron_G4-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fma4'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xop'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Opteron_G5'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fma4'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='tbm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xop'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Opteron_G5-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fma4'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='tbm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xop'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='SapphireRapids'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-bf16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-int8'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-tile'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx-vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-bf16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-fp16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bitalg'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512ifma'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='bus-lock-detect'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrc'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrs'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fzrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='la57'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='serialize'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='taa-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='tsx-ldtrk'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xfd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='SapphireRapids-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-bf16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-int8'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-tile'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx-vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-bf16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-fp16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bitalg'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512ifma'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='bus-lock-detect'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrc'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrs'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fzrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='la57'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='serialize'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='taa-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='tsx-ldtrk'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xfd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='SapphireRapids-v2'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-bf16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-int8'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-tile'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx-vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-bf16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-fp16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bitalg'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512ifma'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='bus-lock-detect'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fbsdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrc'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrs'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fzrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='la57'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='psdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='sbdr-ssdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='serialize'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='taa-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='tsx-ldtrk'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xfd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='SapphireRapids-v3'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-bf16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-int8'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-tile'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx-vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-bf16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-fp16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bitalg'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512ifma'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='bus-lock-detect'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='cldemote'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fbsdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrc'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrs'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fzrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='la57'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='movdir64b'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='movdiri'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='psdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='sbdr-ssdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='serialize'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ss'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='taa-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='tsx-ldtrk'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xfd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='SierraForest'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx-ifma'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx-ne-convert'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx-vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx-vnni-int8'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='bus-lock-detect'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='cmpccxadd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fbsdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrs'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='mcdt-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pbrsb-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='psdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='sbdr-ssdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='serialize'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='SierraForest-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx-ifma'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx-ne-convert'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx-vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx-vnni-int8'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='bus-lock-detect'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='cmpccxadd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fbsdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrs'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='mcdt-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pbrsb-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='psdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='sbdr-ssdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='serialize'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Skylake-Client'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Skylake-Client-IBRS'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Skylake-Client-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Skylake-Client-v2'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Skylake-Client-v3'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Skylake-Client-v4'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Skylake-Server'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Skylake-Server-IBRS'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Skylake-Server-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Skylake-Server-v2'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Skylake-Server-v3'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Skylake-Server-v4'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Skylake-Server-v5'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Snowridge'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='cldemote'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='core-capability'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='movdir64b'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='movdiri'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='mpx'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='split-lock-detect'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Snowridge-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='cldemote'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='core-capability'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='movdir64b'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='movdiri'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='mpx'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='split-lock-detect'/>
Dec 06 10:00:37 compute-1 ceph-mon[79770]: pgmap v573: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 06 10:00:37 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/1794266355' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Snowridge-v2'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='cldemote'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='core-capability'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='movdir64b'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='movdiri'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='split-lock-detect'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Snowridge-v3'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='cldemote'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='core-capability'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='movdir64b'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='movdiri'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='split-lock-detect'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Snowridge-v4'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='cldemote'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='movdir64b'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='movdiri'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='athlon'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='3dnow'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='3dnowext'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='athlon-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='3dnow'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='3dnowext'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='core2duo'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ss'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='core2duo-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ss'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='coreduo'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ss'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='coreduo-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ss'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='n270'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ss'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='n270-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ss'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='phenom'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='3dnow'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='3dnowext'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='phenom-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='3dnow'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='3dnowext'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </mode>
Dec 06 10:00:37 compute-1 nova_compute[228576]:   </cpu>
Dec 06 10:00:37 compute-1 nova_compute[228576]:   <memoryBacking supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <enum name='sourceType'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <value>file</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <value>anonymous</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <value>memfd</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:   </memoryBacking>
Dec 06 10:00:37 compute-1 nova_compute[228576]:   <devices>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <disk supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='diskDevice'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>disk</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>cdrom</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>floppy</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>lun</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='bus'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>fdc</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>scsi</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>virtio</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>usb</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>sata</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='model'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>virtio</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>virtio-transitional</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>virtio-non-transitional</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </disk>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <graphics supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='type'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>vnc</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>egl-headless</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>dbus</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </graphics>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <video supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='modelType'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>vga</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>cirrus</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>virtio</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>none</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>bochs</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>ramfb</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </video>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <hostdev supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='mode'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>subsystem</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='startupPolicy'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>default</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>mandatory</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>requisite</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>optional</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='subsysType'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>usb</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>pci</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>scsi</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='capsType'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='pciBackend'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </hostdev>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <rng supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='model'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>virtio</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>virtio-transitional</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>virtio-non-transitional</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='backendModel'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>random</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>egd</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>builtin</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </rng>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <filesystem supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='driverType'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>path</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>handle</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>virtiofs</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </filesystem>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <tpm supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='model'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>tpm-tis</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>tpm-crb</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='backendModel'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>emulator</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>external</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='backendVersion'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>2.0</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </tpm>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <redirdev supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='bus'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>usb</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </redirdev>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <channel supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='type'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>pty</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>unix</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </channel>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <crypto supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='model'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='type'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>qemu</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='backendModel'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>builtin</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </crypto>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <interface supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='backendType'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>default</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>passt</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </interface>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <panic supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='model'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>isa</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>hyperv</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </panic>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <console supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='type'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>null</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>vc</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>pty</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>dev</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>file</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>pipe</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>stdio</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>udp</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>tcp</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>unix</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>qemu-vdagent</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>dbus</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </console>
Dec 06 10:00:37 compute-1 nova_compute[228576]:   </devices>
Dec 06 10:00:37 compute-1 nova_compute[228576]:   <features>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <gic supported='no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <vmcoreinfo supported='yes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <genid supported='yes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <backingStoreInput supported='yes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <backup supported='yes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <async-teardown supported='yes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <ps2 supported='yes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <sev supported='no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <sgx supported='no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <hyperv supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='features'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>relaxed</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>vapic</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>spinlocks</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>vpindex</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>runtime</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>synic</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>stimer</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>reset</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>vendor_id</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>frequencies</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>reenlightenment</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>tlbflush</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>ipi</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>avic</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>emsr_bitmap</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>xmm_input</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <defaults>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <spinlocks>4095</spinlocks>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <stimer_direct>on</stimer_direct>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <tlbflush_direct>on</tlbflush_direct>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <tlbflush_extended>on</tlbflush_extended>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </defaults>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </hyperv>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <launchSecurity supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='sectype'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>tdx</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </launchSecurity>
Dec 06 10:00:37 compute-1 nova_compute[228576]:   </features>
Dec 06 10:00:37 compute-1 nova_compute[228576]: </domainCapabilities>
Dec 06 10:00:37 compute-1 nova_compute[228576]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.308 228580 DEBUG nova.virt.libvirt.host [None req-631af40b-fc70-41ee-854f-2584c3d4f164 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Dec 06 10:00:37 compute-1 nova_compute[228576]: <domainCapabilities>
Dec 06 10:00:37 compute-1 nova_compute[228576]:   <path>/usr/libexec/qemu-kvm</path>
Dec 06 10:00:37 compute-1 nova_compute[228576]:   <domain>kvm</domain>
Dec 06 10:00:37 compute-1 nova_compute[228576]:   <machine>pc-i440fx-rhel7.6.0</machine>
Dec 06 10:00:37 compute-1 nova_compute[228576]:   <arch>i686</arch>
Dec 06 10:00:37 compute-1 nova_compute[228576]:   <vcpu max='240'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:   <iothreads supported='yes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:   <os supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <enum name='firmware'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <loader supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='type'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>rom</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>pflash</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='readonly'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>yes</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>no</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='secure'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>no</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </loader>
Dec 06 10:00:37 compute-1 nova_compute[228576]:   </os>
Dec 06 10:00:37 compute-1 nova_compute[228576]:   <cpu>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <mode name='host-passthrough' supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='hostPassthroughMigratable'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>on</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>off</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </mode>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <mode name='maximum' supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='maximumMigratable'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>on</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>off</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </mode>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <mode name='host-model' supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <vendor>AMD</vendor>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='x2apic'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='tsc-deadline'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='hypervisor'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='tsc_adjust'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='spec-ctrl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='stibp'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='ssbd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='cmp_legacy'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='overflow-recov'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='succor'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='ibrs'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='amd-ssbd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='virt-ssbd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='lbrv'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='tsc-scale'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='vmcb-clean'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='flushbyasid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='pause-filter'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='pfthreshold'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='svme-addr-chk'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='disable' name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </mode>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <mode name='custom' supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Broadwell'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Broadwell-IBRS'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Broadwell-noTSX'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Broadwell-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Broadwell-v2'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Broadwell-v3'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Broadwell-v4'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Cascadelake-Server'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Cascadelake-Server-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Cascadelake-Server-v2'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Cascadelake-Server-v3'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Cascadelake-Server-v4'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Cascadelake-Server-v5'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Cooperlake'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-bf16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='taa-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Cooperlake-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-bf16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='taa-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Cooperlake-v2'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-bf16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='taa-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Denverton'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='mpx'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Denverton-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='mpx'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Denverton-v2'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Denverton-v3'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Dhyana-v2'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='EPYC-Genoa'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amd-psfd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='auto-ibrs'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-bf16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bitalg'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512ifma'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='la57'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='no-nested-data-bp'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='null-sel-clr-base'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='stibp-always-on'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='EPYC-Genoa-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amd-psfd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='auto-ibrs'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-bf16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bitalg'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512ifma'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='la57'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='no-nested-data-bp'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='null-sel-clr-base'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='stibp-always-on'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='EPYC-Milan'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='EPYC-Milan-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='EPYC-Milan-v2'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amd-psfd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='no-nested-data-bp'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='null-sel-clr-base'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='stibp-always-on'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='EPYC-Rome'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='EPYC-Rome-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='EPYC-Rome-v2'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='EPYC-Rome-v3'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='EPYC-v3'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='EPYC-v4'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='GraniteRapids'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-bf16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-fp16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-int8'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-tile'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx-vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-bf16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-fp16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bitalg'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512ifma'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='bus-lock-detect'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fbsdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrc'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrs'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fzrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='la57'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='mcdt-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pbrsb-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='prefetchiti'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='psdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='sbdr-ssdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='serialize'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='taa-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='tsx-ldtrk'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xfd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='GraniteRapids-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-bf16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-fp16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-int8'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-tile'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx-vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-bf16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-fp16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bitalg'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512ifma'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='bus-lock-detect'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fbsdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrc'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrs'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fzrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='la57'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='mcdt-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pbrsb-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='prefetchiti'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='psdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='sbdr-ssdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='serialize'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='taa-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='tsx-ldtrk'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xfd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='GraniteRapids-v2'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-bf16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-fp16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-int8'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-tile'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx-vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx10'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx10-128'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx10-256'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx10-512'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-bf16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-fp16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bitalg'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512ifma'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='bus-lock-detect'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='cldemote'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fbsdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrc'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrs'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fzrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='la57'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='mcdt-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='movdir64b'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='movdiri'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pbrsb-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='prefetchiti'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='psdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='sbdr-ssdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='serialize'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ss'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='taa-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='tsx-ldtrk'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xfd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Haswell'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Haswell-IBRS'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Haswell-noTSX'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Haswell-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Haswell-v2'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Haswell-v3'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Haswell-v4'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Icelake-Server'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bitalg'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='la57'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Icelake-Server-noTSX'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bitalg'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='la57'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Icelake-Server-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bitalg'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='la57'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Icelake-Server-v2'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bitalg'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='la57'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Icelake-Server-v3'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bitalg'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='la57'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='taa-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Icelake-Server-v4'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bitalg'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512ifma'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='la57'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='taa-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Icelake-Server-v5'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bitalg'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512ifma'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='la57'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='taa-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Icelake-Server-v6'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bitalg'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512ifma'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='la57'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='taa-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Icelake-Server-v7'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bitalg'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512ifma'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='la57'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='taa-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='IvyBridge'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='IvyBridge-IBRS'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='IvyBridge-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='IvyBridge-v2'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='KnightsMill'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-4fmaps'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-4vnniw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512er'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512pf'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ss'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='KnightsMill-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-4fmaps'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-4vnniw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512er'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512pf'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ss'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Opteron_G4'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fma4'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xop'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Opteron_G4-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fma4'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xop'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Opteron_G5'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fma4'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='tbm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xop'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Opteron_G5-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fma4'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='tbm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xop'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='SapphireRapids'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-bf16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-int8'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-tile'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx-vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-bf16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-fp16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bitalg'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512ifma'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='bus-lock-detect'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrc'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrs'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fzrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='la57'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='serialize'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='taa-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='tsx-ldtrk'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xfd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='SapphireRapids-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-bf16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-int8'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-tile'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx-vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-bf16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-fp16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bitalg'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512ifma'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='bus-lock-detect'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrc'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrs'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fzrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='la57'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='serialize'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='taa-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='tsx-ldtrk'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xfd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='SapphireRapids-v2'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-bf16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-int8'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-tile'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx-vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-bf16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-fp16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bitalg'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512ifma'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='bus-lock-detect'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fbsdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrc'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrs'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fzrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='la57'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='psdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='sbdr-ssdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='serialize'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='taa-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='tsx-ldtrk'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xfd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='SapphireRapids-v3'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-bf16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-int8'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-tile'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx-vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-bf16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-fp16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bitalg'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512ifma'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='bus-lock-detect'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='cldemote'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fbsdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrc'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrs'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fzrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='la57'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='movdir64b'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='movdiri'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='psdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='sbdr-ssdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='serialize'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ss'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='taa-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='tsx-ldtrk'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xfd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='SierraForest'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx-ifma'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx-ne-convert'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx-vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx-vnni-int8'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='bus-lock-detect'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='cmpccxadd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fbsdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrs'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='mcdt-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pbrsb-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='psdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='sbdr-ssdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='serialize'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='SierraForest-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx-ifma'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx-ne-convert'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx-vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx-vnni-int8'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='bus-lock-detect'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='cmpccxadd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fbsdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrs'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='mcdt-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pbrsb-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='psdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='sbdr-ssdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='serialize'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Skylake-Client'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Skylake-Client-IBRS'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Skylake-Client-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Skylake-Client-v2'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Skylake-Client-v3'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Skylake-Client-v4'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Skylake-Server'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Skylake-Server-IBRS'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Skylake-Server-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Skylake-Server-v2'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Skylake-Server-v3'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Skylake-Server-v4'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Skylake-Server-v5'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Snowridge'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='cldemote'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='core-capability'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='movdir64b'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='movdiri'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='mpx'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='split-lock-detect'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Snowridge-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='cldemote'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='core-capability'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='movdir64b'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='movdiri'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='mpx'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='split-lock-detect'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Snowridge-v2'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='cldemote'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='core-capability'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='movdir64b'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='movdiri'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='split-lock-detect'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Snowridge-v3'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='cldemote'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='core-capability'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='movdir64b'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='movdiri'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='split-lock-detect'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Snowridge-v4'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='cldemote'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='movdir64b'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='movdiri'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='athlon'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='3dnow'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='3dnowext'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='athlon-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='3dnow'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='3dnowext'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='core2duo'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ss'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='core2duo-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ss'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='coreduo'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ss'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='coreduo-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ss'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='n270'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ss'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='n270-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ss'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='phenom'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='3dnow'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='3dnowext'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='phenom-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='3dnow'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='3dnowext'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </mode>
Dec 06 10:00:37 compute-1 nova_compute[228576]:   </cpu>
Dec 06 10:00:37 compute-1 nova_compute[228576]:   <memoryBacking supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <enum name='sourceType'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <value>file</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <value>anonymous</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <value>memfd</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:   </memoryBacking>
Dec 06 10:00:37 compute-1 nova_compute[228576]:   <devices>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <disk supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='diskDevice'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>disk</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>cdrom</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>floppy</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>lun</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='bus'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>ide</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>fdc</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>scsi</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>virtio</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>usb</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>sata</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='model'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>virtio</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>virtio-transitional</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>virtio-non-transitional</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </disk>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <graphics supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='type'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>vnc</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>egl-headless</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>dbus</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </graphics>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <video supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='modelType'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>vga</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>cirrus</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>virtio</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>none</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>bochs</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>ramfb</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </video>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <hostdev supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='mode'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>subsystem</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='startupPolicy'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>default</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>mandatory</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>requisite</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>optional</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='subsysType'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>usb</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>pci</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>scsi</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='capsType'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='pciBackend'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </hostdev>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <rng supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='model'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>virtio</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>virtio-transitional</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>virtio-non-transitional</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='backendModel'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>random</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>egd</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>builtin</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </rng>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <filesystem supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='driverType'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>path</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>handle</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>virtiofs</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </filesystem>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <tpm supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='model'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>tpm-tis</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>tpm-crb</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='backendModel'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>emulator</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>external</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='backendVersion'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>2.0</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </tpm>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <redirdev supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='bus'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>usb</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </redirdev>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <channel supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='type'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>pty</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>unix</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </channel>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <crypto supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='model'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='type'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>qemu</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='backendModel'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>builtin</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </crypto>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <interface supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='backendType'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>default</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>passt</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </interface>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <panic supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='model'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>isa</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>hyperv</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </panic>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <console supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='type'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>null</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>vc</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>pty</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>dev</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>file</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>pipe</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>stdio</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>udp</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>tcp</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>unix</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>qemu-vdagent</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>dbus</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </console>
Dec 06 10:00:37 compute-1 nova_compute[228576]:   </devices>
Dec 06 10:00:37 compute-1 nova_compute[228576]:   <features>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <gic supported='no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <vmcoreinfo supported='yes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <genid supported='yes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <backingStoreInput supported='yes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <backup supported='yes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <async-teardown supported='yes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <ps2 supported='yes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <sev supported='no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <sgx supported='no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <hyperv supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='features'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>relaxed</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>vapic</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>spinlocks</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>vpindex</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>runtime</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>synic</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>stimer</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>reset</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>vendor_id</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>frequencies</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>reenlightenment</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>tlbflush</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>ipi</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>avic</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>emsr_bitmap</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>xmm_input</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <defaults>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <spinlocks>4095</spinlocks>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <stimer_direct>on</stimer_direct>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <tlbflush_direct>on</tlbflush_direct>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <tlbflush_extended>on</tlbflush_extended>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </defaults>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </hyperv>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <launchSecurity supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='sectype'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>tdx</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </launchSecurity>
Dec 06 10:00:37 compute-1 nova_compute[228576]:   </features>
Dec 06 10:00:37 compute-1 nova_compute[228576]: </domainCapabilities>
Dec 06 10:00:37 compute-1 nova_compute[228576]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.337 228580 DEBUG nova.virt.libvirt.host [None req-631af40b-fc70-41ee-854f-2584c3d4f164 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.342 228580 DEBUG nova.virt.libvirt.host [None req-631af40b-fc70-41ee-854f-2584c3d4f164 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Dec 06 10:00:37 compute-1 nova_compute[228576]: <domainCapabilities>
Dec 06 10:00:37 compute-1 nova_compute[228576]:   <path>/usr/libexec/qemu-kvm</path>
Dec 06 10:00:37 compute-1 nova_compute[228576]:   <domain>kvm</domain>
Dec 06 10:00:37 compute-1 nova_compute[228576]:   <machine>pc-q35-rhel9.8.0</machine>
Dec 06 10:00:37 compute-1 nova_compute[228576]:   <arch>x86_64</arch>
Dec 06 10:00:37 compute-1 nova_compute[228576]:   <vcpu max='4096'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:   <iothreads supported='yes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:   <os supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <enum name='firmware'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <value>efi</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <loader supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='type'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>rom</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>pflash</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='readonly'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>yes</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>no</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='secure'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>yes</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>no</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </loader>
Dec 06 10:00:37 compute-1 nova_compute[228576]:   </os>
Dec 06 10:00:37 compute-1 nova_compute[228576]:   <cpu>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <mode name='host-passthrough' supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='hostPassthroughMigratable'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>on</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>off</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </mode>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <mode name='maximum' supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='maximumMigratable'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>on</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>off</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </mode>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <mode name='host-model' supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <vendor>AMD</vendor>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='x2apic'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='tsc-deadline'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='hypervisor'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='tsc_adjust'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='spec-ctrl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='stibp'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='ssbd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='cmp_legacy'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='overflow-recov'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='succor'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='ibrs'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='amd-ssbd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='virt-ssbd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='lbrv'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='tsc-scale'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='vmcb-clean'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='flushbyasid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='pause-filter'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='pfthreshold'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='svme-addr-chk'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='disable' name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </mode>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <mode name='custom' supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Broadwell'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Broadwell-IBRS'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Broadwell-noTSX'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Broadwell-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Broadwell-v2'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Broadwell-v3'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Broadwell-v4'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Cascadelake-Server'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Cascadelake-Server-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Cascadelake-Server-v2'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Cascadelake-Server-v3'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Cascadelake-Server-v4'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Cascadelake-Server-v5'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Cooperlake'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-bf16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='taa-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Cooperlake-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-bf16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='taa-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Cooperlake-v2'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-bf16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='taa-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Denverton'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='mpx'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Denverton-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='mpx'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Denverton-v2'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Denverton-v3'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Dhyana-v2'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='EPYC-Genoa'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amd-psfd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='auto-ibrs'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-bf16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bitalg'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512ifma'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='la57'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='no-nested-data-bp'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='null-sel-clr-base'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='stibp-always-on'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='EPYC-Genoa-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amd-psfd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='auto-ibrs'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-bf16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bitalg'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512ifma'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='la57'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='no-nested-data-bp'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='null-sel-clr-base'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='stibp-always-on'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='EPYC-Milan'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='EPYC-Milan-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='EPYC-Milan-v2'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amd-psfd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='no-nested-data-bp'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='null-sel-clr-base'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='stibp-always-on'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='EPYC-Rome'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='EPYC-Rome-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='EPYC-Rome-v2'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='EPYC-Rome-v3'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='EPYC-v3'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='EPYC-v4'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='GraniteRapids'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-bf16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-fp16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-int8'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-tile'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx-vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-bf16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-fp16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bitalg'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512ifma'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='bus-lock-detect'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fbsdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrc'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrs'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fzrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='la57'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='mcdt-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pbrsb-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='prefetchiti'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='psdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='sbdr-ssdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='serialize'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='taa-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='tsx-ldtrk'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xfd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='GraniteRapids-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-bf16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-fp16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-int8'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-tile'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx-vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-bf16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-fp16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bitalg'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512ifma'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='bus-lock-detect'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fbsdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrc'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrs'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fzrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='la57'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='mcdt-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pbrsb-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='prefetchiti'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='psdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='sbdr-ssdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='serialize'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='taa-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='tsx-ldtrk'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xfd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='GraniteRapids-v2'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-bf16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-fp16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-int8'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-tile'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx-vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx10'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx10-128'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx10-256'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx10-512'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-bf16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-fp16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bitalg'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512ifma'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='bus-lock-detect'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='cldemote'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fbsdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrc'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrs'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fzrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='la57'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='mcdt-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='movdir64b'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='movdiri'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pbrsb-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='prefetchiti'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='psdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='sbdr-ssdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='serialize'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ss'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='taa-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='tsx-ldtrk'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xfd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Haswell'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Haswell-IBRS'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Haswell-noTSX'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Haswell-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Haswell-v2'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Haswell-v3'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Haswell-v4'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Icelake-Server'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bitalg'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='la57'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Icelake-Server-noTSX'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bitalg'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='la57'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Icelake-Server-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bitalg'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='la57'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Icelake-Server-v2'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bitalg'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='la57'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Icelake-Server-v3'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bitalg'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='la57'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='taa-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Icelake-Server-v4'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bitalg'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512ifma'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='la57'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='taa-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Icelake-Server-v5'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bitalg'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512ifma'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='la57'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='taa-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Icelake-Server-v6'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bitalg'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512ifma'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='la57'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='taa-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Icelake-Server-v7'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bitalg'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512ifma'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='la57'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='taa-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='IvyBridge'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='IvyBridge-IBRS'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='IvyBridge-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='IvyBridge-v2'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='KnightsMill'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-4fmaps'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-4vnniw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512er'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512pf'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ss'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='KnightsMill-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-4fmaps'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-4vnniw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512er'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512pf'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ss'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Opteron_G4'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fma4'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xop'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Opteron_G4-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fma4'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xop'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Opteron_G5'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fma4'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='tbm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xop'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Opteron_G5-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fma4'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='tbm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xop'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='SapphireRapids'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-bf16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-int8'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-tile'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx-vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-bf16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-fp16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bitalg'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512ifma'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='bus-lock-detect'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrc'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrs'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fzrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='la57'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='serialize'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='taa-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='tsx-ldtrk'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xfd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='SapphireRapids-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-bf16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-int8'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-tile'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx-vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-bf16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-fp16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bitalg'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512ifma'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='bus-lock-detect'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrc'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrs'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fzrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='la57'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='serialize'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='taa-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='tsx-ldtrk'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xfd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='SapphireRapids-v2'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-bf16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-int8'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-tile'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx-vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-bf16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-fp16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bitalg'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512ifma'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='bus-lock-detect'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fbsdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrc'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrs'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fzrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='la57'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='psdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='sbdr-ssdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='serialize'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='taa-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='tsx-ldtrk'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xfd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='SapphireRapids-v3'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-bf16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-int8'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-tile'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx-vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-bf16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-fp16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bitalg'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512ifma'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='bus-lock-detect'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='cldemote'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fbsdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrc'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrs'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fzrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='la57'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='movdir64b'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='movdiri'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='psdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='sbdr-ssdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='serialize'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ss'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='taa-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='tsx-ldtrk'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xfd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='SierraForest'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx-ifma'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx-ne-convert'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx-vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx-vnni-int8'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='bus-lock-detect'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='cmpccxadd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fbsdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrs'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='mcdt-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pbrsb-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='psdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='sbdr-ssdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='serialize'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='SierraForest-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx-ifma'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx-ne-convert'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx-vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx-vnni-int8'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='bus-lock-detect'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='cmpccxadd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fbsdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrs'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='mcdt-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pbrsb-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='psdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='sbdr-ssdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='serialize'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Skylake-Client'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Skylake-Client-IBRS'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Skylake-Client-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Skylake-Client-v2'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Skylake-Client-v3'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Skylake-Client-v4'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Skylake-Server'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Skylake-Server-IBRS'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Skylake-Server-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Skylake-Server-v2'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Skylake-Server-v3'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Skylake-Server-v4'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Skylake-Server-v5'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Snowridge'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='cldemote'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='core-capability'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='movdir64b'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='movdiri'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='mpx'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='split-lock-detect'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Snowridge-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='cldemote'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='core-capability'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='movdir64b'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='movdiri'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='mpx'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='split-lock-detect'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Snowridge-v2'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='cldemote'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='core-capability'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='movdir64b'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='movdiri'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='split-lock-detect'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Snowridge-v3'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='cldemote'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='core-capability'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='movdir64b'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='movdiri'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='split-lock-detect'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Snowridge-v4'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='cldemote'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='movdir64b'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='movdiri'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='athlon'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='3dnow'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='3dnowext'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='athlon-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='3dnow'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='3dnowext'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='core2duo'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ss'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='core2duo-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ss'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='coreduo'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ss'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='coreduo-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ss'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='n270'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ss'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='n270-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ss'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='phenom'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='3dnow'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='3dnowext'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='phenom-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='3dnow'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='3dnowext'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </mode>
Dec 06 10:00:37 compute-1 nova_compute[228576]:   </cpu>
Dec 06 10:00:37 compute-1 nova_compute[228576]:   <memoryBacking supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <enum name='sourceType'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <value>file</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <value>anonymous</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <value>memfd</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:   </memoryBacking>
Dec 06 10:00:37 compute-1 nova_compute[228576]:   <devices>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <disk supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='diskDevice'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>disk</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>cdrom</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>floppy</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>lun</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='bus'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>fdc</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>scsi</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>virtio</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>usb</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>sata</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='model'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>virtio</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>virtio-transitional</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>virtio-non-transitional</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </disk>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <graphics supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='type'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>vnc</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>egl-headless</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>dbus</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </graphics>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <video supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='modelType'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>vga</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>cirrus</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>virtio</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>none</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>bochs</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>ramfb</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </video>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <hostdev supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='mode'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>subsystem</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='startupPolicy'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>default</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>mandatory</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>requisite</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>optional</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='subsysType'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>usb</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>pci</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>scsi</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='capsType'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='pciBackend'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </hostdev>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <rng supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='model'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>virtio</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>virtio-transitional</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>virtio-non-transitional</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='backendModel'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>random</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>egd</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>builtin</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </rng>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <filesystem supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='driverType'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>path</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>handle</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>virtiofs</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </filesystem>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <tpm supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='model'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>tpm-tis</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>tpm-crb</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='backendModel'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>emulator</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>external</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='backendVersion'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>2.0</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </tpm>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <redirdev supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='bus'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>usb</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </redirdev>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <channel supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='type'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>pty</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>unix</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </channel>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <crypto supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='model'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='type'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>qemu</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='backendModel'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>builtin</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </crypto>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <interface supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='backendType'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>default</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>passt</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </interface>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <panic supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='model'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>isa</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>hyperv</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </panic>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <console supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='type'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>null</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>vc</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>pty</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>dev</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>file</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>pipe</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>stdio</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>udp</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>tcp</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>unix</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>qemu-vdagent</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>dbus</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </console>
Dec 06 10:00:37 compute-1 nova_compute[228576]:   </devices>
Dec 06 10:00:37 compute-1 nova_compute[228576]:   <features>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <gic supported='no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <vmcoreinfo supported='yes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <genid supported='yes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <backingStoreInput supported='yes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <backup supported='yes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <async-teardown supported='yes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <ps2 supported='yes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <sev supported='no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <sgx supported='no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <hyperv supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='features'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>relaxed</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>vapic</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>spinlocks</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>vpindex</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>runtime</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>synic</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>stimer</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>reset</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>vendor_id</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>frequencies</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>reenlightenment</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>tlbflush</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>ipi</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>avic</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>emsr_bitmap</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>xmm_input</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <defaults>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <spinlocks>4095</spinlocks>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <stimer_direct>on</stimer_direct>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <tlbflush_direct>on</tlbflush_direct>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <tlbflush_extended>on</tlbflush_extended>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </defaults>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </hyperv>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <launchSecurity supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='sectype'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>tdx</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </launchSecurity>
Dec 06 10:00:37 compute-1 nova_compute[228576]:   </features>
Dec 06 10:00:37 compute-1 nova_compute[228576]: </domainCapabilities>
Dec 06 10:00:37 compute-1 nova_compute[228576]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.414 228580 DEBUG nova.virt.libvirt.host [None req-631af40b-fc70-41ee-854f-2584c3d4f164 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Dec 06 10:00:37 compute-1 nova_compute[228576]: <domainCapabilities>
Dec 06 10:00:37 compute-1 nova_compute[228576]:   <path>/usr/libexec/qemu-kvm</path>
Dec 06 10:00:37 compute-1 nova_compute[228576]:   <domain>kvm</domain>
Dec 06 10:00:37 compute-1 nova_compute[228576]:   <machine>pc-i440fx-rhel7.6.0</machine>
Dec 06 10:00:37 compute-1 nova_compute[228576]:   <arch>x86_64</arch>
Dec 06 10:00:37 compute-1 nova_compute[228576]:   <vcpu max='240'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:   <iothreads supported='yes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:   <os supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <enum name='firmware'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <loader supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='type'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>rom</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>pflash</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='readonly'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>yes</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>no</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='secure'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>no</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </loader>
Dec 06 10:00:37 compute-1 nova_compute[228576]:   </os>
Dec 06 10:00:37 compute-1 nova_compute[228576]:   <cpu>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <mode name='host-passthrough' supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='hostPassthroughMigratable'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>on</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>off</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </mode>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <mode name='maximum' supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='maximumMigratable'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>on</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>off</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </mode>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <mode name='host-model' supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <vendor>AMD</vendor>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='x2apic'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='tsc-deadline'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='hypervisor'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='tsc_adjust'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='spec-ctrl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='stibp'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='ssbd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='cmp_legacy'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='overflow-recov'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='succor'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='ibrs'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='amd-ssbd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='virt-ssbd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='lbrv'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='tsc-scale'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='vmcb-clean'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='flushbyasid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='pause-filter'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='pfthreshold'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='svme-addr-chk'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <feature policy='disable' name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </mode>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <mode name='custom' supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Broadwell'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Broadwell-IBRS'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Broadwell-noTSX'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Broadwell-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Broadwell-v2'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Broadwell-v3'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Broadwell-v4'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Cascadelake-Server'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Cascadelake-Server-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Cascadelake-Server-v2'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Cascadelake-Server-v3'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:00:37 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:00:37 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:00:37.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Cascadelake-Server-v4'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Cascadelake-Server-v5'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Cooperlake'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-bf16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='taa-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Cooperlake-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-bf16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='taa-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Cooperlake-v2'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-bf16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='taa-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Denverton'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='mpx'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Denverton-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='mpx'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Denverton-v2'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Denverton-v3'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Dhyana-v2'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='EPYC-Genoa'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amd-psfd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='auto-ibrs'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-bf16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bitalg'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512ifma'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='la57'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='no-nested-data-bp'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='null-sel-clr-base'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='stibp-always-on'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='EPYC-Genoa-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amd-psfd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='auto-ibrs'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-bf16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bitalg'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512ifma'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='la57'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='no-nested-data-bp'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='null-sel-clr-base'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='stibp-always-on'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='EPYC-Milan'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='EPYC-Milan-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='EPYC-Milan-v2'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amd-psfd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='no-nested-data-bp'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='null-sel-clr-base'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='stibp-always-on'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='EPYC-Rome'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='EPYC-Rome-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='EPYC-Rome-v2'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='EPYC-Rome-v3'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='EPYC-v3'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='EPYC-v4'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='GraniteRapids'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-bf16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-fp16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-int8'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-tile'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx-vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-bf16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-fp16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bitalg'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512ifma'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='bus-lock-detect'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fbsdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrc'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrs'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fzrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='la57'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='mcdt-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pbrsb-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='prefetchiti'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='psdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='sbdr-ssdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='serialize'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='taa-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='tsx-ldtrk'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xfd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='GraniteRapids-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-bf16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-fp16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-int8'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-tile'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx-vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-bf16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-fp16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bitalg'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512ifma'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='bus-lock-detect'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fbsdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrc'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrs'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fzrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='la57'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='mcdt-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pbrsb-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='prefetchiti'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='psdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='sbdr-ssdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='serialize'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='taa-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='tsx-ldtrk'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xfd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='GraniteRapids-v2'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-bf16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-fp16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-int8'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-tile'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx-vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx10'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx10-128'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx10-256'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx10-512'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-bf16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-fp16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bitalg'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512ifma'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='bus-lock-detect'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='cldemote'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fbsdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrc'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrs'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fzrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='la57'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='mcdt-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='movdir64b'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='movdiri'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pbrsb-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='prefetchiti'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='psdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='sbdr-ssdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='serialize'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ss'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='taa-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='tsx-ldtrk'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xfd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Haswell'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Haswell-IBRS'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Haswell-noTSX'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Haswell-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Haswell-v2'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Haswell-v3'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Haswell-v4'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Icelake-Server'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bitalg'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='la57'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Icelake-Server-noTSX'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bitalg'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='la57'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Icelake-Server-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bitalg'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='la57'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Icelake-Server-v2'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bitalg'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='la57'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Icelake-Server-v3'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bitalg'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='la57'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='taa-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Icelake-Server-v4'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bitalg'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512ifma'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='la57'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='taa-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Icelake-Server-v5'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bitalg'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512ifma'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='la57'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='taa-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Icelake-Server-v6'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bitalg'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512ifma'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='la57'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='taa-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Icelake-Server-v7'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bitalg'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512ifma'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='la57'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='taa-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='IvyBridge'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='IvyBridge-IBRS'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='IvyBridge-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='IvyBridge-v2'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='KnightsMill'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-4fmaps'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-4vnniw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512er'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512pf'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ss'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='KnightsMill-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-4fmaps'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-4vnniw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512er'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512pf'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ss'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Opteron_G4'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fma4'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xop'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Opteron_G4-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fma4'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xop'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Opteron_G5'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fma4'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='tbm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xop'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Opteron_G5-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fma4'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='tbm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xop'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='SapphireRapids'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-bf16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-int8'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-tile'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx-vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-bf16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-fp16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bitalg'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512ifma'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='bus-lock-detect'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrc'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrs'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fzrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='la57'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='serialize'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='taa-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='tsx-ldtrk'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xfd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='SapphireRapids-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-bf16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-int8'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-tile'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx-vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-bf16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-fp16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bitalg'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512ifma'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='bus-lock-detect'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrc'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrs'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fzrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='la57'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='serialize'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='taa-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='tsx-ldtrk'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xfd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='SapphireRapids-v2'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-bf16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-int8'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-tile'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx-vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-bf16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-fp16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bitalg'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512ifma'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='bus-lock-detect'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fbsdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrc'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrs'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fzrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='la57'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='psdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='sbdr-ssdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='serialize'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='taa-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='tsx-ldtrk'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xfd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='SapphireRapids-v3'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-bf16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-int8'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='amx-tile'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx-vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-bf16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-fp16'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512-vpopcntdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bitalg'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512ifma'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vbmi2'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='bus-lock-detect'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='cldemote'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fbsdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrc'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrs'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fzrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='la57'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='movdir64b'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='movdiri'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='psdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='sbdr-ssdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='serialize'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ss'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='taa-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='tsx-ldtrk'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xfd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='SierraForest'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx-ifma'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx-ne-convert'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx-vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx-vnni-int8'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='bus-lock-detect'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='cmpccxadd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fbsdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrs'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='mcdt-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pbrsb-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='psdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='sbdr-ssdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='serialize'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='SierraForest-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx-ifma'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx-ne-convert'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx-vnni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx-vnni-int8'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='bus-lock-detect'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='cmpccxadd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fbsdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='fsrs'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ibrs-all'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='mcdt-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pbrsb-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='psdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='sbdr-ssdp-no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='serialize'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vaes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='vpclmulqdq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Skylake-Client'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Skylake-Client-IBRS'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Skylake-Client-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Skylake-Client-v2'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Skylake-Client-v3'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Skylake-Client-v4'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Skylake-Server'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Skylake-Server-IBRS'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Skylake-Server-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Skylake-Server-v2'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='hle'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='rtm'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Skylake-Server-v3'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Skylake-Server-v4'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Skylake-Server-v5'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512bw'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512cd'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512dq'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512f'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='avx512vl'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='invpcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pcid'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='pku'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Snowridge'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='cldemote'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='core-capability'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='movdir64b'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='movdiri'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='mpx'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='split-lock-detect'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Snowridge-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='cldemote'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='core-capability'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='movdir64b'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='movdiri'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='mpx'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='split-lock-detect'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Snowridge-v2'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='cldemote'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='core-capability'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='movdir64b'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='movdiri'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='split-lock-detect'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Snowridge-v3'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='cldemote'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='core-capability'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='movdir64b'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='movdiri'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='split-lock-detect'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='Snowridge-v4'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='cldemote'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='erms'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='gfni'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='movdir64b'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='movdiri'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='xsaves'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='athlon'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='3dnow'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='3dnowext'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='athlon-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='3dnow'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='3dnowext'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='core2duo'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ss'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='core2duo-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ss'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='coreduo'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ss'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='coreduo-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ss'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='n270'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ss'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='n270-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='ss'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='phenom'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='3dnow'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='3dnowext'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <blockers model='phenom-v1'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='3dnow'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <feature name='3dnowext'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </blockers>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </mode>
Dec 06 10:00:37 compute-1 nova_compute[228576]:   </cpu>
Dec 06 10:00:37 compute-1 nova_compute[228576]:   <memoryBacking supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <enum name='sourceType'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <value>file</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <value>anonymous</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <value>memfd</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:   </memoryBacking>
Dec 06 10:00:37 compute-1 nova_compute[228576]:   <devices>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <disk supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='diskDevice'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>disk</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>cdrom</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>floppy</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>lun</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='bus'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>ide</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>fdc</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>scsi</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>virtio</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>usb</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>sata</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='model'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>virtio</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>virtio-transitional</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>virtio-non-transitional</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </disk>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <graphics supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='type'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>vnc</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>egl-headless</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>dbus</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </graphics>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <video supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='modelType'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>vga</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>cirrus</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>virtio</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>none</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>bochs</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>ramfb</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </video>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <hostdev supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='mode'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>subsystem</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='startupPolicy'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>default</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>mandatory</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>requisite</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>optional</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='subsysType'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>usb</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>pci</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>scsi</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='capsType'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='pciBackend'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </hostdev>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <rng supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='model'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>virtio</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>virtio-transitional</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>virtio-non-transitional</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='backendModel'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>random</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>egd</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>builtin</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </rng>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <filesystem supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='driverType'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>path</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>handle</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>virtiofs</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </filesystem>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <tpm supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='model'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>tpm-tis</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>tpm-crb</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='backendModel'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>emulator</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>external</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='backendVersion'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>2.0</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </tpm>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <redirdev supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='bus'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>usb</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </redirdev>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <channel supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='type'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>pty</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>unix</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </channel>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <crypto supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='model'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='type'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>qemu</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='backendModel'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>builtin</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </crypto>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <interface supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='backendType'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>default</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>passt</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </interface>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <panic supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='model'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>isa</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>hyperv</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </panic>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <console supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='type'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>null</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>vc</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>pty</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>dev</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>file</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>pipe</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>stdio</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>udp</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>tcp</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>unix</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>qemu-vdagent</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>dbus</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </console>
Dec 06 10:00:37 compute-1 nova_compute[228576]:   </devices>
Dec 06 10:00:37 compute-1 nova_compute[228576]:   <features>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <gic supported='no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <vmcoreinfo supported='yes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <genid supported='yes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <backingStoreInput supported='yes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <backup supported='yes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <async-teardown supported='yes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <ps2 supported='yes'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <sev supported='no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <sgx supported='no'/>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <hyperv supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='features'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>relaxed</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>vapic</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>spinlocks</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>vpindex</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>runtime</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>synic</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>stimer</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>reset</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>vendor_id</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>frequencies</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>reenlightenment</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>tlbflush</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>ipi</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>avic</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>emsr_bitmap</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>xmm_input</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <defaults>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <spinlocks>4095</spinlocks>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <stimer_direct>on</stimer_direct>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <tlbflush_direct>on</tlbflush_direct>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <tlbflush_extended>on</tlbflush_extended>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </defaults>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </hyperv>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     <launchSecurity supported='yes'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       <enum name='sectype'>
Dec 06 10:00:37 compute-1 nova_compute[228576]:         <value>tdx</value>
Dec 06 10:00:37 compute-1 nova_compute[228576]:       </enum>
Dec 06 10:00:37 compute-1 nova_compute[228576]:     </launchSecurity>
Dec 06 10:00:37 compute-1 nova_compute[228576]:   </features>
Dec 06 10:00:37 compute-1 nova_compute[228576]: </domainCapabilities>
Dec 06 10:00:37 compute-1 nova_compute[228576]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
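The domainCapabilities XML dumped above is what the nova libvirt driver queries at startup to learn which CPU models, device types, and security features the hypervisor exposes. A minimal sketch of fetching and filtering the same document with libvirt-python (the read-only qemu:///system connection and the usable-model filter are illustrative assumptions, not nova's exact code path):

    import libvirt
    import xml.etree.ElementTree as ET

    # Read-only access is enough for capability queries.
    conn = libvirt.openReadOnly('qemu:///system')

    # None lets libvirt pick the default emulator and machine type;
    # 'kvm' and 'x86_64' match the host in this log.
    caps_xml = conn.getDomainCapabilities(None, 'x86_64', None, 'kvm', 0)
    conn.close()

    root = ET.fromstring(caps_xml)
    # Print the CPU models reported usable='yes', as in the listing above.
    for model in root.iter('model'):
        if model.get('usable') == 'yes':
            print(model.get('vendor'), model.text)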
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.480 228580 DEBUG nova.virt.libvirt.host [None req-631af40b-fc70-41ee-854f-2584c3d4f164 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.480 228580 INFO nova.virt.libvirt.host [None req-631af40b-fc70-41ee-854f-2584c3d4f164 - - - - - -] Secure Boot support detected
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.483 228580 INFO nova.virt.libvirt.driver [None req-631af40b-fc70-41ee-854f-2584c3d4f164 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.493 228580 DEBUG nova.virt.libvirt.driver [None req-631af40b-fc70-41ee-854f-2584c3d4f164 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.510 228580 INFO nova.virt.node [None req-631af40b-fc70-41ee-854f-2584c3d4f164 - - - - - -] Determined node identity ff2f17cb-ff1d-4da7-9560-4be741380cb1 from /var/lib/nova/compute_id
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.531 228580 WARNING nova.compute.manager [None req-631af40b-fc70-41ee-854f-2584c3d4f164 - - - - - -] Compute nodes ['ff2f17cb-ff1d-4da7-9560-4be741380cb1'] for host compute-1.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.573 228580 INFO nova.compute.manager [None req-631af40b-fc70-41ee-854f-2584c3d4f164 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.602 228580 WARNING nova.compute.manager [None req-631af40b-fc70-41ee-854f-2584c3d4f164 - - - - - -] No compute node record found for host compute-1.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-1.ctlplane.example.com could not be found.
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.603 228580 DEBUG oslo_concurrency.lockutils [None req-631af40b-fc70-41ee-854f-2584c3d4f164 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.603 228580 DEBUG oslo_concurrency.lockutils [None req-631af40b-fc70-41ee-854f-2584c3d4f164 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.604 228580 DEBUG oslo_concurrency.lockutils [None req-631af40b-fc70-41ee-854f-2584c3d4f164 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
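The three lockutils lines above are oslo.concurrency's standard acquire/acquired/released trace around the resource tracker's "compute_resources" semaphore. A hedged sketch of the pattern that produces them (the function body is illustrative):

    from oslo_concurrency import lockutils

    # Entering the decorated function logs "Acquiring lock ... by ...",
    # then "acquired ... waited Ns", and "released ... held Ns" on exit.
    @lockutils.synchronized('compute_resources')
    def clean_compute_node_cache():
        # mutate shared resource-tracker state under the lock
        pass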
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.604 228580 DEBUG nova.compute.resource_tracker [None req-631af40b-fc70-41ee-854f-2584c3d4f164 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:00:37 compute-1 nova_compute[228576]: 2025-12-06 10:00:37.605 228580 DEBUG oslo_concurrency.processutils [None req-631af40b-fc70-41ee-854f-2584c3d4f164 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:00:37 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:00:37 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:00:37 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:00:37.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:00:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:37 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4b8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:00:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:37 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4bc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:00:37 compute-1 rsyslogd[1007]: imjournal from <np0005548916:nova_compute>: begin to drop messages due to rate-limiting
Dec 06 10:00:38 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:00:38 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/656373637' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:00:38 compute-1 nova_compute[228576]: 2025-12-06 10:00:38.088 228580 DEBUG oslo_concurrency.processutils [None req-631af40b-fc70-41ee-854f-2584c3d4f164 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.483s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
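The ceph df call above (issued through oslo_concurrency.processutils) is how the driver sizes its shared storage before reporting resources. A minimal reproduction that parses the same JSON (the stats keys are standard ceph df output, but treat the exact field names as an assumption to verify against your release):

    import json
    import subprocess

    # Same command and credentials as the log line above.
    out = subprocess.check_output([
        'ceph', 'df', '--format=json',
        '--id', 'openstack', '--conf', '/etc/ceph/ceph.conf',
    ])
    stats = json.loads(out)['stats']
    total_gib = stats['total_bytes'] / 2**30
    avail_gib = stats['total_avail_bytes'] / 2**30
    print(f'{avail_gib:.1f} GiB free of {total_gib:.1f} GiB')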
Dec 06 10:00:38 compute-1 systemd[1]: Starting libvirt nodedev daemon...
Dec 06 10:00:38 compute-1 systemd[1]: Started libvirt nodedev daemon.
Dec 06 10:00:38 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/1675628201' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:00:38 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/656373637' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:00:38 compute-1 nova_compute[228576]: 2025-12-06 10:00:38.427 228580 WARNING nova.virt.libvirt.driver [None req-631af40b-fc70-41ee-854f-2584c3d4f164 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:00:38 compute-1 nova_compute[228576]: 2025-12-06 10:00:38.429 228580 DEBUG nova.compute.resource_tracker [None req-631af40b-fc70-41ee-854f-2584c3d4f164 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5232MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:00:38 compute-1 nova_compute[228576]: 2025-12-06 10:00:38.429 228580 DEBUG oslo_concurrency.lockutils [None req-631af40b-fc70-41ee-854f-2584c3d4f164 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:00:38 compute-1 nova_compute[228576]: 2025-12-06 10:00:38.429 228580 DEBUG oslo_concurrency.lockutils [None req-631af40b-fc70-41ee-854f-2584c3d4f164 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:00:38 compute-1 nova_compute[228576]: 2025-12-06 10:00:38.449 228580 WARNING nova.compute.resource_tracker [None req-631af40b-fc70-41ee-854f-2584c3d4f164 - - - - - -] No compute node record for compute-1.ctlplane.example.com:ff2f17cb-ff1d-4da7-9560-4be741380cb1: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host ff2f17cb-ff1d-4da7-9560-4be741380cb1 could not be found.
Dec 06 10:00:38 compute-1 nova_compute[228576]: 2025-12-06 10:00:38.483 228580 INFO nova.compute.resource_tracker [None req-631af40b-fc70-41ee-854f-2584c3d4f164 - - - - - -] Compute node record created for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com with uuid: ff2f17cb-ff1d-4da7-9560-4be741380cb1
Dec 06 10:00:38 compute-1 nova_compute[228576]: 2025-12-06 10:00:38.544 228580 DEBUG nova.compute.resource_tracker [None req-631af40b-fc70-41ee-854f-2584c3d4f164 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:00:38 compute-1 nova_compute[228576]: 2025-12-06 10:00:38.545 228580 DEBUG nova.compute.resource_tracker [None req-631af40b-fc70-41ee-854f-2584c3d4f164 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:00:38 compute-1 nova_compute[228576]: 2025-12-06 10:00:38.687 228580 INFO nova.scheduler.client.report [None req-631af40b-fc70-41ee-854f-2584c3d4f164 - - - - - -] [req-cfa68dd7-6548-4ef8-8713-5d047a3ed750] Created resource provider record via placement API for resource provider with UUID ff2f17cb-ff1d-4da7-9560-4be741380cb1 and name compute-1.ctlplane.example.com.
Dec 06 10:00:38 compute-1 nova_compute[228576]: 2025-12-06 10:00:38.740 228580 DEBUG oslo_concurrency.processutils [None req-631af40b-fc70-41ee-854f-2584c3d4f164 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:00:38 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:00:39 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:39 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4dc0029b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:00:39 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:00:39 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3822391769' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:00:39 compute-1 nova_compute[228576]: 2025-12-06 10:00:39.191 228580 DEBUG oslo_concurrency.processutils [None req-631af40b-fc70-41ee-854f-2584c3d4f164 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:00:39 compute-1 nova_compute[228576]: 2025-12-06 10:00:39.200 228580 DEBUG nova.virt.libvirt.host [None req-631af40b-fc70-41ee-854f-2584c3d4f164 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Dec 06 10:00:39 compute-1 nova_compute[228576]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Dec 06 10:00:39 compute-1 nova_compute[228576]: 2025-12-06 10:00:39.200 228580 INFO nova.virt.libvirt.host [None req-631af40b-fc70-41ee-854f-2584c3d4f164 - - - - - -] kernel doesn't support AMD SEV
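As the two preceding lines show, the AMD SEV probe is just a read of the kvm_amd module parameter (it contained 'N' here, so SEV is reported unsupported). An equivalent check as a sketch (the accepted truthy values are an assumption; nova parses the value as a boolean string):

    # Path taken verbatim from the log line above.
    def kernel_supports_amd_sev(path='/sys/module/kvm_amd/parameters/sev'):
        try:
            with open(path) as f:
                # The parameter reads 'Y'/'1' when SEV is enabled, 'N'/'0' otherwise.
                return f.read().strip() in ('Y', '1')
        except FileNotFoundError:
            # Module not loaded (e.g. Intel host): no SEV support.
            return False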
Dec 06 10:00:39 compute-1 nova_compute[228576]: 2025-12-06 10:00:39.202 228580 DEBUG nova.compute.provider_tree [None req-631af40b-fc70-41ee-854f-2584c3d4f164 - - - - - -] Updating inventory in ProviderTree for provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 59, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 06 10:00:39 compute-1 nova_compute[228576]: 2025-12-06 10:00:39.203 228580 DEBUG nova.virt.libvirt.driver [None req-631af40b-fc70-41ee-854f-2584c3d4f164 - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 06 10:00:39 compute-1 nova_compute[228576]: 2025-12-06 10:00:39.295 228580 DEBUG nova.scheduler.client.report [None req-631af40b-fc70-41ee-854f-2584c3d4f164 - - - - - -] Updated inventory for provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 59, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Dec 06 10:00:39 compute-1 nova_compute[228576]: 2025-12-06 10:00:39.296 228580 DEBUG nova.compute.provider_tree [None req-631af40b-fc70-41ee-854f-2584c3d4f164 - - - - - -] Updating resource provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Dec 06 10:00:39 compute-1 nova_compute[228576]: 2025-12-06 10:00:39.296 228580 DEBUG nova.compute.provider_tree [None req-631af40b-fc70-41ee-854f-2584c3d4f164 - - - - - -] Updating inventory in ProviderTree for provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
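Placement turns each inventory record above into schedulable capacity as (total - reserved) * allocation_ratio, which is why this 8-core, 7679 MB host can advertise 32 VCPU and 7167 MB. A worked check against the exact figures from the log:

    # Capacity formula applied by the Placement service per resource class.
    def capacity(total, reserved, allocation_ratio):
        return int((total - reserved) * allocation_ratio)

    inventory = {
        'MEMORY_MB': (7679, 512, 1.0),  # -> 7167
        'VCPU':      (8,    0,   4.0),  # -> 32
        'DISK_GB':   (59,   0,   0.9),  # -> 53
    }
    for rc, args in inventory.items():
        print(rc, capacity(*args))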
Dec 06 10:00:39 compute-1 ceph-mon[79770]: pgmap v574: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Dec 06 10:00:39 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:00:39 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/2444252692' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:00:39 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/3822391769' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:00:39 compute-1 nova_compute[228576]: 2025-12-06 10:00:39.370 228580 DEBUG nova.compute.provider_tree [None req-631af40b-fc70-41ee-854f-2584c3d4f164 - - - - - -] Updating resource provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Dec 06 10:00:39 compute-1 nova_compute[228576]: 2025-12-06 10:00:39.395 228580 DEBUG nova.compute.resource_tracker [None req-631af40b-fc70-41ee-854f-2584c3d4f164 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:00:39 compute-1 nova_compute[228576]: 2025-12-06 10:00:39.396 228580 DEBUG oslo_concurrency.lockutils [None req-631af40b-fc70-41ee-854f-2584c3d4f164 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.966s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:00:39 compute-1 nova_compute[228576]: 2025-12-06 10:00:39.396 228580 DEBUG nova.service [None req-631af40b-fc70-41ee-854f-2584c3d4f164 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Dec 06 10:00:39 compute-1 nova_compute[228576]: 2025-12-06 10:00:39.476 228580 DEBUG nova.service [None req-631af40b-fc70-41ee-854f-2584c3d4f164 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Dec 06 10:00:39 compute-1 nova_compute[228576]: 2025-12-06 10:00:39.477 228580 DEBUG nova.servicegroup.drivers.db [None req-631af40b-fc70-41ee-854f-2584c3d4f164 - - - - - -] DB_Driver: join new ServiceGroup member compute-1.ctlplane.example.com to the compute group, service = <Service: host=compute-1.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Dec 06 10:00:39 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:00:39 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:00:39 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:00:39.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:00:39 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:00:39 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:00:39 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:00:39.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:00:39 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:39 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4c40041f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:00:39 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:39 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4b8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:00:40 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/4184093383' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:00:41 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:41 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4bc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:00:41 compute-1 ceph-mon[79770]: pgmap v575: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:00:41 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:00:41 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:00:41 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:00:41.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:00:41 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:00:41 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:00:41 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:00:41.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:00:41 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:41 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4dc003e40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:00:41 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:41 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4b0000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:00:43 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:43 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4b8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:00:43 compute-1 ceph-mon[79770]: pgmap v576: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:00:43 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:00:43 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:00:43 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:00:43.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:00:43 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:00:43 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:00:43 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:00:43.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:00:43 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:00:43 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:43 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4b8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:00:43 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:43 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4dc003e40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:00:45 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:45 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4b00016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:00:45 compute-1 ceph-mon[79770]: pgmap v577: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Dec 06 10:00:45 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:00:45 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:00:45 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:00:45.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:00:45 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:00:45 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:00:45 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:00:45.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:00:45 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:45 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4b8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:00:45 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:45 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4b8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:00:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:47 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4dc003e40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:00:47 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:00:47 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:00:47 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:00:47.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:00:47 compute-1 ceph-mon[79770]: pgmap v578: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:00:47 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:00:47 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:00:47 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:00:47.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:00:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:47 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4b00016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:00:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:47 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4b8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:00:48 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:00:49 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:49 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4b8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:00:49 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:00:49 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:00:49 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:00:49.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:00:49 compute-1 ceph-mon[79770]: pgmap v579: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Dec 06 10:00:49 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:00:49 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:00:49 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:00:49.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:00:49 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:49 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4dc003e40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:00:49 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:49 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4dc003e40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:00:50 compute-1 sudo[228969]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:00:50 compute-1 sudo[228969]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:00:50 compute-1 sudo[228969]: pam_unix(sudo:session): session closed for user root
Dec 06 10:00:51 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:51 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4ac000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:00:51 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:00:51 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:00:51 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:00:51.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:00:51 compute-1 ceph-mon[79770]: pgmap v580: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:00:51 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:00:51 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:00:51 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:00:51.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:00:51 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:51 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4c40041f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:00:51 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:51 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4bc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:00:53 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:53 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4dc003e40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:00:53 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:00:53 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:00:53 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:00:53.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:00:53 compute-1 ceph-mon[79770]: pgmap v581: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:00:53 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:00:53 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:00:53 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:00:53.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:00:53 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:00:53 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:53 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4ac0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:00:53 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:53 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4c40041f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:00:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:00:54.272 141446 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:00:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:00:54.274 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:00:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:00:54.274 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:00:54 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:00:55 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:55 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4bc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:00:55 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:00:55 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:00:55 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:00:55.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:00:55 compute-1 ceph-mon[79770]: pgmap v582: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Dec 06 10:00:55 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:00:55 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:00:55 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:00:55.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:00:55 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:55 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4dc003e40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:00:55 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:55 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4ac0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:00:56 compute-1 podman[228998]: 2025-12-06 10:00:56.816247676 +0000 UTC m=+0.114139259 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:00:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:57 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4c40041f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:00:57 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:00:57 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:00:57 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:00:57.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:00:57 compute-1 ceph-mon[79770]: pgmap v583: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:00:57 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:00:57 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:00:57 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:00:57.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:00:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:57 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4bc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:00:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:57 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4dc003e40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:00:58 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:00:59 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:59 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4ac0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:00:59 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:00:59 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:00:59 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:00:59.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:00:59 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:00:59 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:00:59 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:00:59.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:00:59 compute-1 ceph-mon[79770]: pgmap v584: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Dec 06 10:00:59 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:59 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4c40041f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:00:59 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:00:59 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4bc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:01:01 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:01:01 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4dc003e40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:01:01 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:01:01 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:01:01 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:01:01.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:01:01 compute-1 anacron[4460]: Job `cron.monthly' started
Dec 06 10:01:01 compute-1 anacron[4460]: Job `cron.monthly' terminated
Dec 06 10:01:01 compute-1 anacron[4460]: Normal exit (3 jobs run)
Dec 06 10:01:01 compute-1 podman[229026]: 2025-12-06 10:01:01.752415312 +0000 UTC m=+0.056456001 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent)
Dec 06 10:01:01 compute-1 CROND[229049]: (root) CMD (run-parts /etc/cron.hourly)
Dec 06 10:01:01 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:01:01 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:01:01 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:01:01.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:01:01 compute-1 run-parts[229052]: (/etc/cron.hourly) starting 0anacron
Dec 06 10:01:01 compute-1 run-parts[229058]: (/etc/cron.hourly) finished 0anacron
Dec 06 10:01:01 compute-1 CROND[229048]: (root) CMDEND (run-parts /etc/cron.hourly)
Dec 06 10:01:01 compute-1 ceph-mon[79770]: pgmap v585: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:01:01 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:01:01 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4ac002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:01:01 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:01:01 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4c40041f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:01:02 compute-1 ceph-mon[79770]: pgmap v586: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:01:03 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:01:03 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4bc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:01:03 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:01:03 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:01:03 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:01:03.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:01:03 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:01:03 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:01:03 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:01:03.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:01:03 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:01:03 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:01:03 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4dc003e40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:01:03 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:01:03 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4ac002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:01:05 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:01:05 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4c40041f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:01:05 compute-1 ceph-mon[79770]: pgmap v587: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Dec 06 10:01:05 compute-1 ceph-mon[79770]: from='client.? 192.168.122.10:0/3133958809' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 06 10:01:05 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:01:05 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:01:05 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:01:05.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:01:05 compute-1 podman[229061]: 2025-12-06 10:01:05.764206792 +0000 UTC m=+0.063826159 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd)
Dec 06 10:01:05 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:01:05 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:01:05 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:01:05.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:01:05 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:01:05 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4bc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:01:05 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:01:05 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4dc003e40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:01:06 compute-1 ceph-mon[79770]: from='client.? 192.168.122.10:0/3133958809' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 06 10:01:06 compute-1 ceph-mon[79770]: from='client.? 192.168.122.10:0/4193963036' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 06 10:01:06 compute-1 ceph-mon[79770]: from='client.? 192.168.122.10:0/4193963036' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 06 10:01:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:01:07 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4ac002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:01:07 compute-1 rsyslogd[1007]: imjournal: 1887 messages lost due to rate-limiting (20000 allowed within 600 seconds)
Dec 06 10:01:07 compute-1 ceph-mon[79770]: pgmap v588: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:01:07 compute-1 ceph-mon[79770]: from='client.? 192.168.122.10:0/162190854' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 06 10:01:07 compute-1 ceph-mon[79770]: from='client.? 192.168.122.10:0/162190854' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 06 10:01:07 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:01:07 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:01:07 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:01:07.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:01:07 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:01:07 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:01:07 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:01:07.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:01:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:01:07 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4c40041f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:01:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:01:07 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4c40041f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:01:08 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:01:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:01:09 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4dc003e40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:01:09 compute-1 ceph-mon[79770]: pgmap v589: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Dec 06 10:01:09 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:01:09 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:01:09 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:01:09 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:01:09.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:01:09 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:01:09 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:01:09 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:01:09.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:01:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:01:09 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4ac003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:01:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:01:09 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4ac003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:01:10 compute-1 sudo[229083]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:01:10 compute-1 sudo[229083]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:01:10 compute-1 sudo[229083]: pam_unix(sudo:session): session closed for user root
Dec 06 10:01:11 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:01:11 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4bc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:01:11 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:01:11 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:01:11 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:01:11.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:01:11 compute-1 ceph-mon[79770]: pgmap v590: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:01:11 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:01:11 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:01:11 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:01:11.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:01:11 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:01:11 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4dc003e40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:01:11 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:01:11 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4dc003e40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:01:13 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:01:13 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4c40041f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:01:13 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:01:13 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:01:13 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:01:13.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:01:13 compute-1 ceph-mon[79770]: pgmap v591: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:01:13 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:01:13 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:01:13 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:01:13 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:01:13.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:01:13 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:01:13 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4bc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:01:13 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:01:13 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4bc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:01:15 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:01:15 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4dc003e40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:01:15 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:01:15 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:01:15 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:01:15.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:01:15 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:01:15 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:01:15 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:01:15.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:01:15 compute-1 ceph-mon[79770]: pgmap v592: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Dec 06 10:01:15 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:01:15 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4c40041f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:01:15 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:01:15 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4bc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:01:16 compute-1 ceph-mon[79770]: pgmap v593: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:01:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:01:17 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4ac003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:01:17 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:01:17 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:01:17 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:01:17.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:01:17 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:01:17 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:01:17 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:01:17.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:01:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:01:17 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4dc003e40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:01:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:01:17 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4c40041f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:01:18 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:01:19 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:01:19 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4bc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:01:19 compute-1 ceph-mon[79770]: pgmap v594: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Dec 06 10:01:19 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:01:19 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:01:19 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:01:19.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:01:19 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:01:19 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:01:19 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:01:19.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:01:19 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:01:19 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4bc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:01:20 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:01:20 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4dc003e40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:01:21 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:01:21 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4c40041f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:01:21 compute-1 ceph-mon[79770]: pgmap v595: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:01:21 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:01:21 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:01:21 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:01:21.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:01:21 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:01:21 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:01:21 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:01:21.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:01:21 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:01:21 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4c40041f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:01:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:01:22 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4b0000f30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:01:23 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:01:23 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4dc003e40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:01:23 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:01:23 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:01:23 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:01:23.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:01:23 compute-1 ceph-mon[79770]: pgmap v596: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:01:23 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:01:23 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:01:23 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:01:23 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:01:23.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:01:23 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:01:23 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4ac003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:01:24 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:01:24 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4c40041f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:01:24 compute-1 sudo[229118]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:01:24 compute-1 sudo[229118]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:01:24 compute-1 sudo[229118]: pam_unix(sudo:session): session closed for user root
Dec 06 10:01:24 compute-1 sudo[229143]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 06 10:01:24 compute-1 sudo[229143]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:01:24 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:01:24 compute-1 sudo[229143]: pam_unix(sudo:session): session closed for user root
Dec 06 10:01:25 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:01:25 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4c40041f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:01:25 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:01:25 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:01:25 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:01:25.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:01:25 compute-1 ceph-mon[79770]: pgmap v597: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Dec 06 10:01:25 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 10:01:25 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 06 10:01:25 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:01:25 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:01:25 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 06 10:01:25 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 06 10:01:25 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 10:01:25 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:01:25 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:01:25 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:01:25.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:01:25 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:01:25 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4dc003e40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:01:26 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:01:26 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4ac003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:01:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:01:27 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4b0001a90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:01:27 compute-1 nova_compute[228576]: 2025-12-06 10:01:27.478 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:01:27 compute-1 nova_compute[228576]: 2025-12-06 10:01:27.517 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:01:27 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:01:27 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:01:27 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:01:27.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:01:27 compute-1 podman[229201]: 2025-12-06 10:01:27.815091541 +0000 UTC m=+0.106267169 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:01:27 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:01:27 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:01:27 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:01:27.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:01:28 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:01:28 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4dc003e40 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:01:28 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:01:28 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4c4004210 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:01:28 compute-1 ceph-mon[79770]: pgmap v598: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:01:28 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:01:29 compute-1 ceph-mon[79770]: pgmap v599: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Dec 06 10:01:29 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:01:29 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4c4004210 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:01:29 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:01:29 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:01:29 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:01:29.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:01:29 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:01:29 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:01:29 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:01:29.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:01:29 compute-1 sudo[229230]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:01:29 compute-1 sudo[229230]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:01:29 compute-1 sudo[229230]: pam_unix(sudo:session): session closed for user root
Dec 06 10:01:29 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:01:29 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4b8001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:01:30 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[224757]: 06/12/2025 10:01:30 : epoch 6933fe9c : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa4b0001c10 fd 38 proxy ignored for local
Dec 06 10:01:30 compute-1 kernel: ganesha.nfsd[229115]: segfault at 50 ip 00007fa58ed6532e sp 00007fa5477fd210 error 4 in libntirpc.so.5.8[7fa58ed4a000+2c000] likely on CPU 4 (core 0, socket 4)
Dec 06 10:01:30 compute-1 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec 06 10:01:30 compute-1 systemd[1]: Started Process Core Dump (PID 229255/UID 0).
Dec 06 10:01:30 compute-1 sudo[229257]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:01:30 compute-1 sudo[229257]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:01:30 compute-1 sudo[229257]: pam_unix(sudo:session): session closed for user root
Dec 06 10:01:30 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:01:30 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:01:31 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:01:31 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:01:31 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:01:31.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:01:31 compute-1 systemd-coredump[229256]: Process 224784 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 59:
                                                    #0  0x00007fa58ed6532e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    ELF object binary architecture: AMD x86-64
Dec 06 10:01:31 compute-1 systemd[1]: systemd-coredump@7-229255-0.service: Deactivated successfully.
Dec 06 10:01:31 compute-1 systemd[1]: systemd-coredump@7-229255-0.service: Consumed 1.526s CPU time.
Dec 06 10:01:31 compute-1 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 06 10:01:31 compute-1 podman[229288]: 2025-12-06 10:01:31.724887996 +0000 UTC m=+0.034846217 container died 6dc139c09dbc99a313d5333e87cc0ba0df15ffda5b12614866d45ea226e1d6ce (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2)
Dec 06 10:01:31 compute-1 systemd[1]: var-lib-containers-storage-overlay-8c79bfeb25e587d3943a06906c158dd3f62a52f59079e06c39c4ba774c28c036-merged.mount: Deactivated successfully.
Dec 06 10:01:31 compute-1 podman[229288]: 2025-12-06 10:01:31.76836964 +0000 UTC m=+0.078327861 container remove 6dc139c09dbc99a313d5333e87cc0ba0df15ffda5b12614866d45ea226e1d6ce (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.schema-version=1.0)
Dec 06 10:01:31 compute-1 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Main process exited, code=exited, status=139/n/a
Dec 06 10:01:31 compute-1 ceph-mon[79770]: pgmap v600: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:01:31 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:01:31 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:01:31 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:01:31.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:01:31 compute-1 podman[229302]: 2025-12-06 10:01:31.897085233 +0000 UTC m=+0.084644574 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0)
Dec 06 10:01:31 compute-1 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Failed with result 'exit-code'.
Dec 06 10:01:31 compute-1 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Consumed 1.784s CPU time.
Dec 06 10:01:32 compute-1 ceph-mon[79770]: pgmap v601: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:01:33 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:01:33 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:01:33 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:01:33.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:01:33 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:01:33 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:01:33 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:01:33 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:01:33.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:01:35 compute-1 ceph-mon[79770]: pgmap v602: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Dec 06 10:01:35 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:01:35 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:01:35 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:01:35.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:01:35 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:01:35 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:01:35 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:01:35.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:01:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/100136 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec 06 10:01:36 compute-1 nova_compute[228576]: 2025-12-06 10:01:36.473 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:01:36 compute-1 nova_compute[228576]: 2025-12-06 10:01:36.474 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:01:36 compute-1 nova_compute[228576]: 2025-12-06 10:01:36.474 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:01:36 compute-1 nova_compute[228576]: 2025-12-06 10:01:36.474 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:01:36 compute-1 nova_compute[228576]: 2025-12-06 10:01:36.542 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 06 10:01:36 compute-1 nova_compute[228576]: 2025-12-06 10:01:36.543 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:01:36 compute-1 nova_compute[228576]: 2025-12-06 10:01:36.543 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:01:36 compute-1 nova_compute[228576]: 2025-12-06 10:01:36.543 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:01:36 compute-1 nova_compute[228576]: 2025-12-06 10:01:36.543 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:01:36 compute-1 nova_compute[228576]: 2025-12-06 10:01:36.544 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:01:36 compute-1 nova_compute[228576]: 2025-12-06 10:01:36.544 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:01:36 compute-1 nova_compute[228576]: 2025-12-06 10:01:36.544 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:01:36 compute-1 nova_compute[228576]: 2025-12-06 10:01:36.544 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:01:36 compute-1 nova_compute[228576]: 2025-12-06 10:01:36.610 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:01:36 compute-1 nova_compute[228576]: 2025-12-06 10:01:36.611 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:01:36 compute-1 nova_compute[228576]: 2025-12-06 10:01:36.611 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:01:36 compute-1 nova_compute[228576]: 2025-12-06 10:01:36.611 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:01:36 compute-1 nova_compute[228576]: 2025-12-06 10:01:36.612 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:01:36 compute-1 podman[229347]: 2025-12-06 10:01:36.760518982 +0000 UTC m=+0.058689505 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd)
Dec 06 10:01:37 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:01:37 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/244775073' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:01:37 compute-1 nova_compute[228576]: 2025-12-06 10:01:37.075 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:01:37 compute-1 nova_compute[228576]: 2025-12-06 10:01:37.268 228580 WARNING nova.virt.libvirt.driver [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:01:37 compute-1 nova_compute[228576]: 2025-12-06 10:01:37.270 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5231MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:01:37 compute-1 nova_compute[228576]: 2025-12-06 10:01:37.270 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:01:37 compute-1 nova_compute[228576]: 2025-12-06 10:01:37.271 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:01:37 compute-1 ceph-mon[79770]: pgmap v603: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:01:37 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/2682509999' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:01:37 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/244775073' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:01:37 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/3744631434' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:01:37 compute-1 nova_compute[228576]: 2025-12-06 10:01:37.486 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:01:37 compute-1 nova_compute[228576]: 2025-12-06 10:01:37.487 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:01:37 compute-1 nova_compute[228576]: 2025-12-06 10:01:37.557 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:01:37 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:01:37 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:01:37 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:01:37.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:01:37 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:01:37 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:01:37 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:01:37.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:01:37 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:01:37 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2859480221' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:01:38 compute-1 nova_compute[228576]: 2025-12-06 10:01:38.006 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:01:38 compute-1 nova_compute[228576]: 2025-12-06 10:01:38.013 228580 DEBUG nova.compute.provider_tree [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed in ProviderTree for provider: ff2f17cb-ff1d-4da7-9560-4be741380cb1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:01:38 compute-1 nova_compute[228576]: 2025-12-06 10:01:38.038 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed for provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:01:38 compute-1 nova_compute[228576]: 2025-12-06 10:01:38.040 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:01:38 compute-1 nova_compute[228576]: 2025-12-06 10:01:38.040 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.770s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:01:38 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/1251141983' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:01:38 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/2859480221' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:01:38 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/1184462004' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:01:38 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:01:39 compute-1 ceph-mon[79770]: pgmap v604: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Dec 06 10:01:39 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:01:39 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:01:39 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:01:39 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:01:39.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:01:39 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:01:39 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:01:39 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:01:39.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:01:41 compute-1 ceph-mon[79770]: pgmap v605: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:01:41 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:01:41 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:01:41 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:01:41.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:01:41 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:01:41 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:01:41 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:01:41.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:01:42 compute-1 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Scheduled restart job, restart counter is at 8.
Dec 06 10:01:42 compute-1 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.djsnbu for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec 06 10:01:42 compute-1 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Consumed 1.784s CPU time.
Dec 06 10:01:42 compute-1 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.djsnbu for 5ecd3f74-dade-5fc4-92ce-8950ae424258...
Dec 06 10:01:42 compute-1 podman[229465]: 2025-12-06 10:01:42.415586336 +0000 UTC m=+0.057204039 container create cfd84277d1dcac04a876f3b0ccbf223dd9196bdf0059805be5855adee48962d9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 06 10:01:42 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ce8c7da4624aa519272ef2c8bd30d12c947da67ff2923b4958fe16726ed31e84/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec 06 10:01:42 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ce8c7da4624aa519272ef2c8bd30d12c947da67ff2923b4958fe16726ed31e84/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 06 10:01:42 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ce8c7da4624aa519272ef2c8bd30d12c947da67ff2923b4958fe16726ed31e84/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec 06 10:01:42 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ce8c7da4624aa519272ef2c8bd30d12c947da67ff2923b4958fe16726ed31e84/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.djsnbu-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec 06 10:01:42 compute-1 podman[229465]: 2025-12-06 10:01:42.477728143 +0000 UTC m=+0.119345846 container init cfd84277d1dcac04a876f3b0ccbf223dd9196bdf0059805be5855adee48962d9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_REF=squid)
Dec 06 10:01:42 compute-1 podman[229465]: 2025-12-06 10:01:42.385588808 +0000 UTC m=+0.027206581 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 06 10:01:42 compute-1 podman[229465]: 2025-12-06 10:01:42.488905274 +0000 UTC m=+0.130522957 container start cfd84277d1dcac04a876f3b0ccbf223dd9196bdf0059805be5855adee48962d9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325)
Dec 06 10:01:42 compute-1 bash[229465]: cfd84277d1dcac04a876f3b0ccbf223dd9196bdf0059805be5855adee48962d9
Dec 06 10:01:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:42 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec 06 10:01:42 compute-1 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.djsnbu for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec 06 10:01:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:42 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec 06 10:01:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:42 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec 06 10:01:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:42 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec 06 10:01:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:42 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec 06 10:01:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:42 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec 06 10:01:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:42 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec 06 10:01:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:42 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:01:43 compute-1 ceph-mon[79770]: pgmap v606: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:01:43 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:01:43 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:01:43 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:01:43.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:01:43 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:01:43 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:01:43 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:01:43 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:01:43.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:01:45 compute-1 ceph-mon[79770]: pgmap v607: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 682 B/s rd, 85 B/s wr, 0 op/s
Dec 06 10:01:45 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:01:45 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:01:45 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:01:45.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:01:45 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:01:45 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:01:45 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:01:45.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:01:47 compute-1 ceph-mon[79770]: pgmap v608: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Dec 06 10:01:47 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:01:47 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:01:47 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:01:47.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:01:47 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:01:47 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:01:47 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:01:47.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:01:48 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:48 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:01:48 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:48 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:01:48 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:01:49 compute-1 ceph-mon[79770]: pgmap v609: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.4 KiB/s rd, 597 B/s wr, 2 op/s
Dec 06 10:01:49 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:01:49 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:01:49 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:01:49.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:01:49 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:01:49 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:01:49 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:01:49.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:01:50 compute-1 sudo[229527]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:01:50 compute-1 sudo[229527]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:01:50 compute-1 sudo[229527]: pam_unix(sudo:session): session closed for user root
Dec 06 10:01:51 compute-1 ceph-mon[79770]: pgmap v610: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Dec 06 10:01:51 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:01:51 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:01:51 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:01:51.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:01:51 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:01:51 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:01:51 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:01:51.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:01:53 compute-1 ceph-mon[79770]: pgmap v611: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Dec 06 10:01:53 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:01:53 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:01:53 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:01:53.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:01:53 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:01:53 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:01:53 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:01:53 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:01:53.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:01:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:01:54.273 141446 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:01:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:01:54.274 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:01:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:01:54.274 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:01:54 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:01:54 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec 06 10:01:54 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec 06 10:01:54 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec 06 10:01:54 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec 06 10:01:54 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec 06 10:01:54 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec 06 10:01:54 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec 06 10:01:54 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 06 10:01:54 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 06 10:01:54 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 06 10:01:54 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec 06 10:01:54 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 06 10:01:54 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec 06 10:01:54 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec 06 10:01:54 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec 06 10:01:54 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec 06 10:01:54 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec 06 10:01:54 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec 06 10:01:54 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec 06 10:01:54 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec 06 10:01:54 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec 06 10:01:54 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec 06 10:01:54 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec 06 10:01:54 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec 06 10:01:54 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec 06 10:01:54 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec 06 10:01:54 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec 06 10:01:55 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "version", "format": "json"} v 0)
Dec 06 10:01:55 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1363450763' entity='client.openstack' cmd=[{"prefix": "version", "format": "json"}]: dispatch
Dec 06 10:01:55 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:55 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2248000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:01:55 compute-1 ceph-mon[79770]: pgmap v612: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.3 KiB/s rd, 1023 B/s wr, 3 op/s
Dec 06 10:01:55 compute-1 ceph-mon[79770]: from='client.? 192.168.122.10:0/1363450763' entity='client.openstack' cmd=[{"prefix": "version", "format": "json"}]: dispatch
Dec 06 10:01:55 compute-1 ceph-mon[79770]: from='client.? 192.168.122.10:0/2413463250' entity='client.openstack' cmd=[{"prefix": "version", "format": "json"}]: dispatch
Dec 06 10:01:55 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:01:55 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:01:55 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:01:55.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:01:55 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:01:55 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:01:55 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:01:55.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:01:55 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:55 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0014d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:01:56 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:56 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:01:56 compute-1 ceph-mon[79770]: from='client.24542 -' entity='client.openstack' cmd=[{"prefix": "fs volume ls", "format": "json"}]: dispatch
Dec 06 10:01:56 compute-1 ceph-mon[79770]: from='client.24628 -' entity='client.openstack' cmd=[{"prefix": "fs volume ls", "format": "json"}]: dispatch
Dec 06 10:01:56 compute-1 ceph-mon[79770]: from='client.24628 -' entity='client.openstack' cmd=[{"prefix": "nfs cluster info", "cluster_id": "cephfs", "format": "json"}]: dispatch
Dec 06 10:01:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:57 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:01:57 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:01:57 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:01:57 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:01:57.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:01:57 compute-1 ceph-mon[79770]: pgmap v613: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.9 KiB/s rd, 938 B/s wr, 2 op/s
Dec 06 10:01:57 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:01:57 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:01:57 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:01:57.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:01:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:57 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228000fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:01:58 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:58 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:01:58 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/100158 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec 06 10:01:58 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:01:58 compute-1 podman[229571]: 2025-12-06 10:01:58.850315723 +0000 UTC m=+0.146305030 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec 06 10:01:59 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:59 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22240016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:01:59 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:01:59 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:01:59 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:01:59.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:01:59 compute-1 ceph-mon[79770]: pgmap v614: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.0 KiB/s rd, 938 B/s wr, 3 op/s
Dec 06 10:01:59 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:01:59 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:01:59 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:01:59.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:01:59 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:01:59 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:00 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:00 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:01 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:01 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:01 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:02:01 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:02:01 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:01.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:02:01 compute-1 ceph-mon[79770]: pgmap v615: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 426 B/s wr, 1 op/s
Dec 06 10:02:01 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:02:01 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:02:01 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:01.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:02:01 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:01 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22240016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:02 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:02 compute-1 podman[229599]: 2025-12-06 10:02:02.743093996 +0000 UTC m=+0.053657563 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Dec 06 10:02:03 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:03 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:03 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:02:03 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:02:03 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:03.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:02:03 compute-1 ceph-mon[79770]: pgmap v616: 337 pgs: 337 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 426 B/s wr, 1 op/s
Dec 06 10:02:03 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:02:03 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:02:03 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:02:03 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:03.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:02:03 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:03 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:04 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22240016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:05 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:05 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:05 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:02:05 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:02:05 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:05.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:02:05 compute-1 ceph-mon[79770]: pgmap v617: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 79 KiB/s rd, 426 B/s wr, 131 op/s
Dec 06 10:02:05 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:02:05 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:02:05 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:05.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:02:05 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:05 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:06 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:07 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:07 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:02:07 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:02:07 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:07.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:02:07 compute-1 podman[229622]: 2025-12-06 10:02:07.751645797 +0000 UTC m=+0.062029196 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:02:07 compute-1 ceph-mon[79770]: pgmap v618: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 78 KiB/s rd, 0 B/s wr, 129 op/s
Dec 06 10:02:07 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:02:07 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:02:07 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:07.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:02:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:07 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:08 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:08 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:08 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:02:08 compute-1 ceph-mon[79770]: pgmap v619: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 78 KiB/s rd, 0 B/s wr, 129 op/s
Dec 06 10:02:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:09 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:09 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:02:09 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:02:09 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:09.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:02:09 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:02:09 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:02:09 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:09.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:02:09 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:02:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:09 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:10 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:10 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:10 compute-1 sudo[229644]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:02:10 compute-1 sudo[229644]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:02:10 compute-1 sudo[229644]: pam_unix(sudo:session): session closed for user root
Dec 06 10:02:10 compute-1 ceph-mon[79770]: pgmap v620: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 78 KiB/s rd, 0 B/s wr, 129 op/s
Dec 06 10:02:11 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:11 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:11 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:02:11 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:02:11 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:11.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:02:11 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:02:11 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:02:11 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:11.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:02:11 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:11 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:12 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:13 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:13 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:13 compute-1 ceph-mon[79770]: pgmap v621: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 78 KiB/s rd, 0 B/s wr, 129 op/s
Dec 06 10:02:13 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:02:13 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:02:13 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:13.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:02:13 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:02:13 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:02:13 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:02:13 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:13.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:02:13 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:13 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:14 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:14 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:14 compute-1 ceph-mon[79770]: from='client.? 192.168.122.10:0/2729948875' entity='client.openstack' cmd=[{"prefix": "version", "format": "json"}]: dispatch
Dec 06 10:02:14 compute-1 ceph-mon[79770]: from='client.24637 -' entity='client.openstack' cmd=[{"prefix": "fs volume ls", "format": "json"}]: dispatch
Dec 06 10:02:14 compute-1 ceph-mon[79770]: from='client.? 192.168.122.10:0/3432749316' entity='client.openstack' cmd=[{"prefix": "version", "format": "json"}]: dispatch
Dec 06 10:02:14 compute-1 ceph-mon[79770]: from='client.15012 -' entity='client.openstack' cmd=[{"prefix": "fs volume ls", "format": "json"}]: dispatch
Dec 06 10:02:14 compute-1 ceph-mon[79770]: from='client.15012 -' entity='client.openstack' cmd=[{"prefix": "nfs cluster info", "cluster_id": "cephfs", "format": "json"}]: dispatch
Dec 06 10:02:15 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:15 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:15 compute-1 ceph-mon[79770]: pgmap v622: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 78 KiB/s rd, 0 B/s wr, 129 op/s
Dec 06 10:02:15 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:02:15 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:02:15 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:15.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:02:15 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:02:15 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:02:15 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:15.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:02:15 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:15 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:16 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:16 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:17 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:17 compute-1 ceph-mon[79770]: pgmap v623: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:02:17 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:02:17 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:02:17 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:17.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:02:17 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:02:17 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:02:17 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:17.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:02:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:17 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:18 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:18 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:02:19 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:19 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:19 compute-1 ceph-mon[79770]: pgmap v624: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Dec 06 10:02:19 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:02:19 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:02:19 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:19.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:02:19 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:02:19 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:02:19 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:19.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:02:19 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:19 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:20 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:20 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:21 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:21 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:21 compute-1 ceph-mon[79770]: pgmap v625: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:02:21 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:02:21 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:02:21 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:21.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:02:21 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:02:21 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:02:21 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:21.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:02:21 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:21 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:22 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:23 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:23 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:23 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:02:23 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:02:23 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:23.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:02:23 compute-1 ceph-mon[79770]: pgmap v626: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:02:23 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:02:23 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:02:23 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:02:23 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:23.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:02:23 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:23 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:24 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:24 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:24 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:02:25 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:25 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:25 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:02:25 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:02:25 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:25.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:02:25 compute-1 ceph-mon[79770]: pgmap v627: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Dec 06 10:02:25 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:02:25 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:02:25 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:25.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:02:25 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:25 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:26 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:26 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:26 compute-1 ceph-mon[79770]: pgmap v628: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:02:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:27 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:27 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:02:27 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:02:27 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:27.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:02:27 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:02:27 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:02:27 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:27.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:02:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:27 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:28 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:28 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:28 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:02:29 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:29 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:29 compute-1 ceph-mon[79770]: pgmap v629: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Dec 06 10:02:29 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:02:29 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:02:29 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:29.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:02:29 compute-1 podman[229680]: 2025-12-06 10:02:29.796524489 +0000 UTC m=+0.103882051 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Dec 06 10:02:29 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:02:29 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:02:29 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:29.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:02:29 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:29 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218000d00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:30 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:30 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:30 compute-1 sudo[229707]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:02:30 compute-1 sudo[229707]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:02:30 compute-1 sudo[229707]: pam_unix(sudo:session): session closed for user root
Dec 06 10:02:30 compute-1 sudo[229732]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 06 10:02:30 compute-1 sudo[229732]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:02:30 compute-1 sudo[229771]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:02:30 compute-1 sudo[229771]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:02:30 compute-1 sudo[229771]: pam_unix(sudo:session): session closed for user root
Dec 06 10:02:30 compute-1 sudo[229732]: pam_unix(sudo:session): session closed for user root
Dec 06 10:02:31 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:31 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:31 compute-1 ceph-mon[79770]: pgmap v630: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:02:31 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 10:02:31 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 06 10:02:31 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:02:31 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:02:31 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 06 10:02:31 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 06 10:02:31 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 10:02:31 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:02:31 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:02:31 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:31.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:02:31 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:02:31 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:02:31 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:31.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:02:31 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:31 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:32 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218001820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:33 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:33 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:33 compute-1 ceph-mon[79770]: pgmap v631: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:02:33 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:02:33 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:02:33 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:33.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:02:33 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:02:33 compute-1 podman[229814]: 2025-12-06 10:02:33.807135764 +0000 UTC m=+0.102534642 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Dec 06 10:02:33 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:02:33 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:02:33 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:33.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:02:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:34 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400027d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:34 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:35 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:35 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218001820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:35 compute-1 ceph-mon[79770]: pgmap v632: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Dec 06 10:02:35 compute-1 sudo[229834]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:02:35 compute-1 sudo[229834]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:02:35 compute-1 sudo[229834]: pam_unix(sudo:session): session closed for user root
Dec 06 10:02:35 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:02:35 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:02:35 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:35.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:02:35 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:02:35 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:02:35 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:35.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:02:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:36 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:36 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400027d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:36 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:02:36 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:02:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:37 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:37 compute-1 ceph-mon[79770]: pgmap v633: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:02:37 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:02:37 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:02:37 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:37.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:02:37 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:02:37 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:02:37 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:37.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:02:38 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:38 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:38 compute-1 nova_compute[228576]: 2025-12-06 10:02:38.030 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:02:38 compute-1 nova_compute[228576]: 2025-12-06 10:02:38.030 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:02:38 compute-1 nova_compute[228576]: 2025-12-06 10:02:38.050 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:02:38 compute-1 nova_compute[228576]: 2025-12-06 10:02:38.051 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:02:38 compute-1 nova_compute[228576]: 2025-12-06 10:02:38.051 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:02:38 compute-1 nova_compute[228576]: 2025-12-06 10:02:38.065 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 06 10:02:38 compute-1 nova_compute[228576]: 2025-12-06 10:02:38.066 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:02:38 compute-1 nova_compute[228576]: 2025-12-06 10:02:38.066 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:02:38 compute-1 nova_compute[228576]: 2025-12-06 10:02:38.067 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:02:38 compute-1 nova_compute[228576]: 2025-12-06 10:02:38.067 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:02:38 compute-1 nova_compute[228576]: 2025-12-06 10:02:38.067 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:02:38 compute-1 nova_compute[228576]: 2025-12-06 10:02:38.068 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:02:38 compute-1 nova_compute[228576]: 2025-12-06 10:02:38.068 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:02:38 compute-1 nova_compute[228576]: 2025-12-06 10:02:38.068 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:02:38 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:38 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:38 compute-1 nova_compute[228576]: 2025-12-06 10:02:38.102 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:02:38 compute-1 nova_compute[228576]: 2025-12-06 10:02:38.103 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:02:38 compute-1 nova_compute[228576]: 2025-12-06 10:02:38.103 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:02:38 compute-1 nova_compute[228576]: 2025-12-06 10:02:38.104 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:02:38 compute-1 nova_compute[228576]: 2025-12-06 10:02:38.104 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:02:38 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/2516502700' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:02:38 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:02:38 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1539383251' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:02:38 compute-1 nova_compute[228576]: 2025-12-06 10:02:38.542 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.438s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:02:38 compute-1 nova_compute[228576]: 2025-12-06 10:02:38.750 228580 WARNING nova.virt.libvirt.driver [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:02:38 compute-1 nova_compute[228576]: 2025-12-06 10:02:38.752 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5225MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:02:38 compute-1 nova_compute[228576]: 2025-12-06 10:02:38.752 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:02:38 compute-1 nova_compute[228576]: 2025-12-06 10:02:38.753 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:02:38 compute-1 podman[229883]: 2025-12-06 10:02:38.767366194 +0000 UTC m=+0.067119369 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 06 10:02:38 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:02:38 compute-1 nova_compute[228576]: 2025-12-06 10:02:38.883 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:02:38 compute-1 nova_compute[228576]: 2025-12-06 10:02:38.884 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:02:38 compute-1 nova_compute[228576]: 2025-12-06 10:02:38.929 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:02:39 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:39 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400027d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:39 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:02:39 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1691930506' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:02:39 compute-1 nova_compute[228576]: 2025-12-06 10:02:39.365 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.436s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:02:39 compute-1 nova_compute[228576]: 2025-12-06 10:02:39.371 228580 DEBUG nova.compute.provider_tree [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed in ProviderTree for provider: ff2f17cb-ff1d-4da7-9560-4be741380cb1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:02:39 compute-1 nova_compute[228576]: 2025-12-06 10:02:39.389 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed for provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:02:39 compute-1 nova_compute[228576]: 2025-12-06 10:02:39.390 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:02:39 compute-1 nova_compute[228576]: 2025-12-06 10:02:39.391 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.638s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:02:39 compute-1 ceph-mon[79770]: pgmap v634: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Dec 06 10:02:39 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/1539383251' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:02:39 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/2540342928' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:02:39 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/4280590991' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:02:39 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:02:39 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/1691930506' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:02:39 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/2010034084' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:02:39 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:02:39 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:02:39 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:39.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:02:39 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:02:39 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:02:39 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:39.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:02:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:40 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22180028c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:40 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22180028c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:41 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:41 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:41 compute-1 ceph-mon[79770]: pgmap v635: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:02:41 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:02:41 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:02:41 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:41.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:02:41 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:02:41 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:02:41 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:41.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:02:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:42 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240003c60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:42 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:43 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:43 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22180028c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:43 compute-1 ceph-mon[79770]: pgmap v636: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:02:43 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:02:43 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:02:43 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:43.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:02:43 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:02:43 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:02:43 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:02:43 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:43.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:02:44 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:44 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:44 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:44 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:45 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:45 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:45 compute-1 ceph-mon[79770]: pgmap v637: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Dec 06 10:02:45 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:02:45 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:02:45 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:45.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:02:45 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:02:45 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:02:45 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:45.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:02:46 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:46 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22180028c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:46 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:46 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240003c60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:47 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:47 compute-1 ceph-mon[79770]: from='client.? 192.168.122.10:0/795001534' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 06 10:02:47 compute-1 ceph-mon[79770]: from='client.? 192.168.122.10:0/795001534' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 06 10:02:47 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:02:47 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:02:47 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:47.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:02:47 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:02:47 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:02:47 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:47.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:02:48 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:48 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:48 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:48 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:48 compute-1 ceph-mon[79770]: pgmap v638: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:02:48 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:02:49 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/100249 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 1ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec 06 10:02:49 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:49 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:49 compute-1 ceph-mon[79770]: pgmap v639: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Dec 06 10:02:49 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:02:49 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:02:49 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:49.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:02:49 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:02:49 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:02:49 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:49.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:02:50 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:50 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:50 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:50 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:50 compute-1 sudo[229929]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:02:50 compute-1 sudo[229929]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:02:50 compute-1 sudo[229929]: pam_unix(sudo:session): session closed for user root
Dec 06 10:02:51 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:51 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:51 compute-1 ceph-mon[79770]: pgmap v640: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:02:51 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:02:51 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:02:51 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:51.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:02:51 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:02:51 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:02:51 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:51.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:02:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:52 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:52 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:53 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:53 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:53 compute-1 ceph-mon[79770]: pgmap v641: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:02:53 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:02:53 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:02:53 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:53.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:02:53 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:02:53 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:02:53 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:02:53 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:53.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:02:54 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:54 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:02:54.274 141446 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:02:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:02:54.275 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:02:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:02:54.275 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:02:54 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:02:55 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:55 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:55 compute-1 ceph-mon[79770]: pgmap v642: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Dec 06 10:02:55 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:02:55 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:02:55 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:55.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:02:55 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:02:55 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:02:55 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:55.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:02:56 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:56 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:56 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:56 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:57 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:57 compute-1 ceph-mon[79770]: pgmap v643: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 06 10:02:57 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:02:57 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:02:57 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:57.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:02:57 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:02:57 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:02:57 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:57.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:02:58 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:58 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:58 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:58 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:58 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:58 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:02:58 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:02:59 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:02:59 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:02:59 compute-1 ceph-mon[79770]: pgmap v644: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 682 B/s rd, 85 B/s wr, 0 op/s
Dec 06 10:02:59 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:02:59 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:02:59 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:02:59.669 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:02:59 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:02:59 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:02:59 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:02:59.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:03:00 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:00 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:00 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:00 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228002ad0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:00 compute-1 podman[229961]: 2025-12-06 10:03:00.787534522 +0000 UTC m=+0.088815487 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:03:01 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:01 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:01 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:01 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:03:01 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:01 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:03:01 compute-1 ceph-mon[79770]: pgmap v645: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Dec 06 10:03:01 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:03:01 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:03:01 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:01.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:03:01 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:03:01 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:03:01 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:01.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:03:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:02 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:02 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:03 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:03 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228002ad0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:03 compute-1 ceph-mon[79770]: pgmap v646: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Dec 06 10:03:03 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:03:03 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:03:03 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:03.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:03:03 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:03:03 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:03:03 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:03:03 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:03.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:03:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:04 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:04 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:04 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec 06 10:03:04 compute-1 podman[229989]: 2025-12-06 10:03:04.771392785 +0000 UTC m=+0.066064862 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 06 10:03:05 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:05 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:05 compute-1 ceph-mon[79770]: pgmap v647: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 938 B/s wr, 3 op/s
Dec 06 10:03:05 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:03:05 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:03:05 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:05.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:03:05 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:03:05 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:03:05 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:05.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:03:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:06 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228002ad0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:06 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:06 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:03:06.959 141446 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '9a:dc:0d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'b6:0a:c4:b8:be:39'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:03:06 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:03:06.960 141446 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 06 10:03:06 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:03:06.961 141446 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=61eba479-a995-4b31-88b9-8ebfcea9907e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 10:03:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:07 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:07 compute-1 ceph-mon[79770]: pgmap v648: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 2.1 KiB/s rd, 938 B/s wr, 3 op/s
Dec 06 10:03:07 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:03:07 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:03:07 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:07.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:03:07 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:03:07 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:03:07 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:07.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:03:08 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:08 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:08 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:08 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228002ad0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:08 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:03:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:09 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:09 compute-1 ceph-mon[79770]: pgmap v649: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 2.3 KiB/s rd, 1023 B/s wr, 3 op/s
Dec 06 10:03:09 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:03:09 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:03:09 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:03:09 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:09.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:03:09 compute-1 podman[230011]: 2025-12-06 10:03:09.759175416 +0000 UTC m=+0.064995837 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=multipathd)
Dec 06 10:03:09 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:03:09 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:03:09 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:09.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:03:10 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:10 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:10 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:10 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:10 compute-1 sudo[230031]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:03:10 compute-1 sudo[230031]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:03:10 compute-1 sudo[230031]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:11 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/100311 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec 06 10:03:11 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:11 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228002ad0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:11 compute-1 ceph-mon[79770]: pgmap v650: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 1.8 KiB/s rd, 938 B/s wr, 2 op/s
Dec 06 10:03:11 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:03:11 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:03:11 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:11.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:03:11 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:03:11 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:03:11 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:11.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:03:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:12 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:12 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:13 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:13 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:13 compute-1 ceph-mon[79770]: pgmap v651: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 1.8 KiB/s rd, 938 B/s wr, 2 op/s
Dec 06 10:03:13 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:03:13 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:03:13 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:13.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:03:13 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:03:14 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:03:14 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:03:14 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:14.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:03:14 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:14 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280038d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:14 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:14 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:15 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:15 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:15 compute-1 ceph-mon[79770]: pgmap v652: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 1.8 KiB/s rd, 938 B/s wr, 2 op/s
Dec 06 10:03:15 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:03:15 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:03:15 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:15.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:03:16 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:03:16 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:03:16 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:16.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:03:16 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:16 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:16 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:16 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280038d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:17 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:17 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:03:17 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:03:17 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:17.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:03:17 compute-1 ceph-mon[79770]: pgmap v653: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Dec 06 10:03:18 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:03:18 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:03:18 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:18.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:03:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:18 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:18 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:18 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:03:19 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:19 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280038d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:19 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:03:19 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:03:19 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:19.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:03:19 compute-1 ceph-mon[79770]: pgmap v654: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 85 B/s wr, 0 op/s
Dec 06 10:03:20 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:03:20 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:03:20 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:20.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:03:20 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:20 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:20 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:20 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:21 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:21 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:21 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:03:21 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:03:21 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:21.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:03:21 compute-1 ceph-mon[79770]: pgmap v655: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:03:22 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:03:22 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:03:22 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:22.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:03:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:22 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280038d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:22 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:23 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:23 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:23 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:03:23 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:03:23 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:23.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:03:23 compute-1 ceph-mon[79770]: pgmap v656: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:03:23 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:03:24 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:03:24 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:03:24 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:24.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:03:24 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:24 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:24 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:24 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280038d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:24 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:03:25 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:25 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:25 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:03:25 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:03:25 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:25.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:03:25 compute-1 ceph-mon[79770]: pgmap v657: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:03:26 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:03:26 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:03:26 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:26.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:03:26 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:26 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:26 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:26 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:27 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280038d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:27 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:03:27 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:03:27 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:27.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:03:27 compute-1 ceph-mon[79770]: pgmap v658: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:03:28 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:03:28 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:03:28 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:28.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:03:28 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:28 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0021d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:28 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:28 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:28 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:03:29 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:29 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:29 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:03:29 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:03:29 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:29.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:03:29 compute-1 ceph-mon[79770]: pgmap v659: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:03:30 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:03:30 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:03:30 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:30.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:03:30 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:30 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280038d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:30 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:30 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003040 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:30 compute-1 sudo[230067]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:03:30 compute-1 sudo[230067]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:03:30 compute-1 sudo[230067]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:31 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:31 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:31 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:03:31 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:03:31 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:31.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:03:31 compute-1 podman[230093]: 2025-12-06 10:03:31.812915022 +0000 UTC m=+0.114226008 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
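
This health_status event comes from the healthcheck declared in the config_data above ('test': '/openstack/healthcheck', mounted from /var/lib/openstack/healthchecks/ovn_controller). The same state podman is reporting (healthy, failing streak 0) can be read back directly; a sketch, assuming a podman release where the Docker-style .State.Health field is populated (older releases expose .State.Healthcheck instead):

    import json
    import subprocess

    out = subprocess.run(
        ["podman", "inspect", "--format", "{{json .State.Health}}",
         "ovn_controller"],
        capture_output=True, text=True, check=True).stdout
    health = json.loads(out)
    print(health["Status"], health["FailingStreak"])  # e.g. healthy 0
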
Dec 06 10:03:31 compute-1 ceph-mon[79770]: pgmap v660: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:03:32 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:03:32 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:03:32 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:32.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:03:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:32 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:32 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:33 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:33 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003040 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:33 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:03:33 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:03:33 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:33.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:03:33 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:03:33 compute-1 ceph-mon[79770]: pgmap v661: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:03:34 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:03:34 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:03:34 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:34.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:03:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:34 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:34 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:35 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:35 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:35 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:03:35 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:03:35 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:35.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:03:35 compute-1 podman[230123]: 2025-12-06 10:03:35.80768273 +0000 UTC m=+0.105000763 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:03:35 compute-1 sudo[230140]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:03:35 compute-1 sudo[230140]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:03:35 compute-1 sudo[230140]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:35 compute-1 ceph-mon[79770]: pgmap v662: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:03:35 compute-1 sudo[230168]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 06 10:03:35 compute-1 sudo[230168]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:03:36 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:03:36 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:03:36 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:36.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:03:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:36 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003040 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:36 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:36 compute-1 sudo[230168]: pam_unix(sudo:session): session closed for user root
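
The sudo triple above (/bin/true connectivity check, which python3, then the hash-named cephadm copy run with gather-facts) is cephadm's periodic host-refresh cycle, driven over SSH as ceph-admin. gather-facts emits host facts as JSON; a sketch reproducing it, with the packaged cephadm binary standing in for the hash-named copy under /var/lib/ceph/<fsid>/:

    import json
    import subprocess

    facts = json.loads(subprocess.run(
        ["sudo", "cephadm", "gather-facts"],
        capture_output=True, text=True, check=True).stdout)
    # Key names assumed from cephadm's fact output; guard with .get().
    print(facts.get("hostname"), facts.get("kernel"))
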
Dec 06 10:03:36 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 10:03:36 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 06 10:03:36 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:03:36 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:03:36 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 06 10:03:36 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 06 10:03:36 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
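
This burst of audited dispatches is the mgr's cephadm module refreshing per-host state: a minimal client conf, the admin and bootstrap-osd keyrings, and a check for destroyed OSDs in the tree. Each audited prefix maps onto a plain CLI call; two read-only ones as a sketch, assuming an admin keyring is present on the node:

    import subprocess

    def ceph(*args):
        return subprocess.run(["ceph", *args], capture_output=True,
                              text=True, check=True).stdout

    print(ceph("config", "generate-minimal-conf"))
    # The audit line filters on states=["destroyed"]:
    print(ceph("osd", "tree", "destroyed", "--format", "json"))
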
Dec 06 10:03:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:37 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:37 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:03:37 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:03:37 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:37.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:03:37 compute-1 ceph-mon[79770]: pgmap v663: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:03:37 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/2391830354' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:03:38 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:03:38 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:03:38 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:38.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:03:38 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:38 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:38 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:38 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003040 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:38 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:03:38 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/2255593849' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:03:38 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/3270802321' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:03:38 compute-1 ceph-mon[79770]: pgmap v664: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:03:38 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/2555214126' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:03:39 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:39 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:39 compute-1 nova_compute[228576]: 2025-12-06 10:03:39.392 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:03:39 compute-1 nova_compute[228576]: 2025-12-06 10:03:39.393 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:03:39 compute-1 nova_compute[228576]: 2025-12-06 10:03:39.393 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:03:39 compute-1 nova_compute[228576]: 2025-12-06 10:03:39.431 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 06 10:03:39 compute-1 nova_compute[228576]: 2025-12-06 10:03:39.431 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:03:39 compute-1 nova_compute[228576]: 2025-12-06 10:03:39.431 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:03:39 compute-1 nova_compute[228576]: 2025-12-06 10:03:39.432 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:03:39 compute-1 nova_compute[228576]: 2025-12-06 10:03:39.432 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:03:39 compute-1 nova_compute[228576]: 2025-12-06 10:03:39.432 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:03:39 compute-1 nova_compute[228576]: 2025-12-06 10:03:39.432 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:03:39 compute-1 nova_compute[228576]: 2025-12-06 10:03:39.432 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:03:39 compute-1 nova_compute[228576]: 2025-12-06 10:03:39.452 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:03:39 compute-1 nova_compute[228576]: 2025-12-06 10:03:39.452 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:03:39 compute-1 nova_compute[228576]: 2025-12-06 10:03:39.452 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:03:39 compute-1 nova_compute[228576]: 2025-12-06 10:03:39.453 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:03:39 compute-1 nova_compute[228576]: 2025-12-06 10:03:39.453 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:03:39 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:03:39 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:03:39 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:39.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:03:39 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:03:39 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3965991936' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:03:39 compute-1 nova_compute[228576]: 2025-12-06 10:03:39.912 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
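
update_available_resource shells out to the exact command logged above to size the RBD-backed disk pool, and the run takes ~0.45 s each time (it fires twice per periodic pass, here and again at 10:03:40). A sketch of the same call and the fields a caller typically reads; key names follow the ceph df JSON schema, so treat them as assumptions to verify against your release:

    import json
    import subprocess

    raw = subprocess.run(
        ["ceph", "df", "--format=json", "--id", "openstack",
         "--conf", "/etc/ceph/ceph.conf"],
        capture_output=True, text=True, check=True).stdout
    df = json.loads(raw)

    # Cluster totals plus per-pool stats.
    print("avail bytes:", df["stats"]["total_avail_bytes"])
    for pool in df["pools"]:
        print(pool["name"], pool["stats"].get("stored"),
              pool["stats"].get("max_avail"))
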
Dec 06 10:03:39 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:03:39 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/3965991936' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:03:40 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:03:40 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:03:40 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:40.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:03:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:40 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:40 compute-1 nova_compute[228576]: 2025-12-06 10:03:40.098 228580 WARNING nova.virt.libvirt.driver [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:03:40 compute-1 nova_compute[228576]: 2025-12-06 10:03:40.099 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5260MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:03:40 compute-1 nova_compute[228576]: 2025-12-06 10:03:40.100 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:03:40 compute-1 nova_compute[228576]: 2025-12-06 10:03:40.100 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:03:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:40 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:40 compute-1 nova_compute[228576]: 2025-12-06 10:03:40.157 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:03:40 compute-1 nova_compute[228576]: 2025-12-06 10:03:40.158 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:03:40 compute-1 nova_compute[228576]: 2025-12-06 10:03:40.172 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:03:40 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:03:40 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3811939761' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:03:40 compute-1 nova_compute[228576]: 2025-12-06 10:03:40.613 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.441s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:03:40 compute-1 nova_compute[228576]: 2025-12-06 10:03:40.620 228580 DEBUG nova.compute.provider_tree [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed in ProviderTree for provider: ff2f17cb-ff1d-4da7-9560-4be741380cb1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:03:40 compute-1 nova_compute[228576]: 2025-12-06 10:03:40.637 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed for provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
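
The inventory payload above is what the resource tracker pushed to placement; schedulable capacity per resource class is (total - reserved) * allocation_ratio, which is why an 8-vCPU, 7679 MB host schedules as 32 vCPUs but only ~53 GB of disk:

    # Numbers copied from the inventory line above.
    inventory = {
        "MEMORY_MB": {"total": 7679, "reserved": 512, "allocation_ratio": 1.0},
        "VCPU":      {"total": 8,    "reserved": 0,   "allocation_ratio": 4.0},
        "DISK_GB":   {"total": 59,   "reserved": 0,   "allocation_ratio": 0.9},
    }
    for rc, inv in inventory.items():
        print(rc, (inv["total"] - inv["reserved"]) * inv["allocation_ratio"])
    # -> MEMORY_MB 7167.0, VCPU 32.0, DISK_GB 53.1
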
Dec 06 10:03:40 compute-1 nova_compute[228576]: 2025-12-06 10:03:40.640 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:03:40 compute-1 nova_compute[228576]: 2025-12-06 10:03:40.640 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.540s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:03:40 compute-1 nova_compute[228576]: 2025-12-06 10:03:40.680 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:03:40 compute-1 nova_compute[228576]: 2025-12-06 10:03:40.681 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:03:40 compute-1 podman[230273]: 2025-12-06 10:03:40.778412035 +0000 UTC m=+0.074666733 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3)
Dec 06 10:03:40 compute-1 ceph-mon[79770]: pgmap v665: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:03:40 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/3811939761' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:03:41 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:41 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003040 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:41 compute-1 sudo[230293]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:03:41 compute-1 sudo[230293]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:03:41 compute-1 sudo[230293]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:41 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:03:41 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:03:41 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:41.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:03:42 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:03:42 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:03:42 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:42.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:03:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:42 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:42 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224001ee0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:42 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:03:42 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:03:43 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:43 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:43 compute-1 ceph-mon[79770]: pgmap v666: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:03:43 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:03:43 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:03:43 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:43.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:03:43 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:03:44 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:03:44 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:03:44 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:44.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:03:44 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:44 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003a20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:44 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:44 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:45 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:45 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224001ee0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:45 compute-1 ceph-mon[79770]: pgmap v667: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:03:45 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:03:45 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:03:45 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:45.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:03:46 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:03:46 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:03:46 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:46.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:03:46 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:46 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:46 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:46 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:46 compute-1 ceph-mon[79770]: from='client.? 192.168.122.10:0/2191484556' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 06 10:03:46 compute-1 ceph-mon[79770]: from='client.? 192.168.122.10:0/2191484556' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
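
The paired dispatches from 192.168.122.10 (cluster df, then get-quota on the volumes pool) look like a Cinder RBD capacity poll rather than anything local to this host; that attribution is inferred from the client IP and pool name, not stated in the log. The quota half, reproduced read-only:

    import json
    import subprocess

    quota = json.loads(subprocess.run(
        ["ceph", "osd", "pool", "get-quota", "volumes", "--format", "json"],
        capture_output=True, text=True, check=True).stdout)
    print(quota.get("quota_max_bytes"), quota.get("quota_max_objects"))
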
Dec 06 10:03:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:47 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:47 compute-1 ceph-mon[79770]: pgmap v668: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:03:47 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:03:47 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:03:47 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:47.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:03:48 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:03:48 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:03:48 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:48.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:03:48 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:48 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224001ee0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:48 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:48 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003a20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:48 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:03:49 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:49 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:49 compute-1 ceph-mon[79770]: pgmap v669: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:03:49 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:03:49 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:03:49 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:49.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:03:50 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:03:50 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:03:50 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:50.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:03:50 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:50 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:50 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:50 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224001ee0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:50 compute-1 sudo[230323]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:03:50 compute-1 sudo[230323]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:03:50 compute-1 sudo[230323]: pam_unix(sudo:session): session closed for user root
Dec 06 10:03:51 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:51 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003a20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:51 compute-1 ceph-mon[79770]: pgmap v670: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:03:51 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:03:51 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:03:51 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:51.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:03:52 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:03:52 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:03:52 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:52.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:03:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:52 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:52 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:53 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:53 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224001ee0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:53 compute-1 ceph-mon[79770]: pgmap v671: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:03:53 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:03:53 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:03:53 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:53.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:03:53 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:03:54 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:03:54 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:03:54 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:54.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:03:54 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003a20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:54 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:03:54.276 141446 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:03:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:03:54.277 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:03:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:03:54.277 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
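
The acquire/acquired/released triple is oslo.concurrency's standard trace: every guarded section logs how long it waited for and then held the named lock (here a 1 ms wait and sub-millisecond hold, i.e. an idle ProcessMonitor; the nova_compute "compute_resources" lines at 10:03:39-40 are the same mechanism). A stub of the pattern that produces it:

    from oslo_concurrency import lockutils

    # Same lock name as the journal; the body is a stand-in for
    # ProcessMonitor._check_child_processes.
    @lockutils.synchronized("_check_child_processes")
    def check_child_processes():
        pass  # respawn any dead child processes here

    check_child_processes()
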
Dec 06 10:03:54 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:03:55 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:55 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:55 compute-1 ceph-mon[79770]: pgmap v672: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:03:55 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:03:55 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:03:55 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:55.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:03:56 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:03:56 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:03:56 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:56.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:03:56 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:56 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004200 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:56 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:56 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003a20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:57 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:57 compute-1 ceph-mon[79770]: pgmap v673: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:03:57 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:03:57 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:03:57 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:57.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:03:58 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:03:58 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:03:58 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:03:58.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:03:58 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:58 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:58 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:58 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004200 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:58 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:03:59 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:03:59 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003a20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:03:59 compute-1 ceph-mon[79770]: pgmap v674: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:03:59 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:03:59 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:03:59 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:03:59.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:04:00 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:04:00 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:04:00 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:00.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:04:00 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:00 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:00 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:00 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:01 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:01 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004200 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:01 compute-1 ceph-mon[79770]: pgmap v675: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:04:01 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:04:01 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:04:01 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:01.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:04:02 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:04:02 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:04:02 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:02.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:04:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:02 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003a20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:02 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c001080 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:02 compute-1 podman[230355]: 2025-12-06 10:04:02.818181929 +0000 UTC m=+0.123200116 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 06 10:04:03 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:03 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:03 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:04:03 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:04:03 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:03.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:04:03 compute-1 ceph-mon[79770]: pgmap v676: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:04:03 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:04:04 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:04:04 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:04:04 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:04.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:04:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:04 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004200 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:04 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280014d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:04 compute-1 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #40. Immutable memtables: 0.
Dec 06 10:04:04 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:04:04.796432) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 06 10:04:04 compute-1 ceph-mon[79770]: rocksdb: [db/flush_job.cc:856] [default] [JOB 21] Flushing memtable with next log file: 40
Dec 06 10:04:04 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015444796541, "job": 21, "event": "flush_started", "num_memtables": 1, "num_entries": 2348, "num_deletes": 251, "total_data_size": 6048881, "memory_usage": 6142096, "flush_reason": "Manual Compaction"}
Dec 06 10:04:04 compute-1 ceph-mon[79770]: rocksdb: [db/flush_job.cc:885] [default] [JOB 21] Level-0 flush table #41: started
Dec 06 10:04:04 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015444818261, "cf_name": "default", "job": 21, "event": "table_file_creation", "file_number": 41, "file_size": 3957290, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 20740, "largest_seqno": 23083, "table_properties": {"data_size": 3947958, "index_size": 5826, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2437, "raw_key_size": 19302, "raw_average_key_size": 20, "raw_value_size": 3929213, "raw_average_value_size": 4084, "num_data_blocks": 257, "num_entries": 962, "num_filter_entries": 962, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015229, "oldest_key_time": 1765015229, "file_creation_time": 1765015444, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 41, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:04:04 compute-1 ceph-mon[79770]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 21] Flush lasted 21991 microseconds, and 9465 cpu microseconds.
Dec 06 10:04:04 compute-1 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:04:04 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:04:04.818422) [db/flush_job.cc:967] [default] [JOB 21] Level-0 flush table #41: 3957290 bytes OK
Dec 06 10:04:04 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:04:04.818491) [db/memtable_list.cc:519] [default] Level-0 commit table #41 started
Dec 06 10:04:04 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:04:04.821674) [db/memtable_list.cc:722] [default] Level-0 commit table #41: memtable #1 done
Dec 06 10:04:04 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:04:04.821693) EVENT_LOG_v1 {"time_micros": 1765015444821688, "job": 21, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 06 10:04:04 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:04:04.821711) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 06 10:04:04 compute-1 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 21] Try to delete WAL files size 6038482, prev total WAL file size 6038482, number of live WAL files 2.
Dec 06 10:04:04 compute-1 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000037.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:04:04 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:04:04.823819) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031353036' seq:72057594037927935, type:22 .. '7061786F730031373538' seq:0, type:0; will stop at (end)
Dec 06 10:04:04 compute-1 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 22] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 06 10:04:04 compute-1 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 21 Base level 0, inputs: [41(3864KB)], [39(13MB)]
Dec 06 10:04:04 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015444823906, "job": 22, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [41], "files_L6": [39], "score": -1, "input_data_size": 17912057, "oldest_snapshot_seqno": -1}
Dec 06 10:04:04 compute-1 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 22] Generated table #42: 5482 keys, 15736878 bytes, temperature: kUnknown
Dec 06 10:04:04 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015444905853, "cf_name": "default", "job": 22, "event": "table_file_creation", "file_number": 42, "file_size": 15736878, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15697398, "index_size": 24650, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13765, "raw_key_size": 138128, "raw_average_key_size": 25, "raw_value_size": 15595288, "raw_average_value_size": 2844, "num_data_blocks": 1018, "num_entries": 5482, "num_filter_entries": 5482, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014012, "oldest_key_time": 0, "file_creation_time": 1765015444, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 42, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:04:04 compute-1 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:04:04 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:04:04.906192) [db/compaction/compaction_job.cc:1663] [default] [JOB 22] Compacted 1@0 + 1@6 files to L6 => 15736878 bytes
Dec 06 10:04:04 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:04:04.907402) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 218.3 rd, 191.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.8, 13.3 +0.0 blob) out(15.0 +0.0 blob), read-write-amplify(8.5) write-amplify(4.0) OK, records in: 5998, records dropped: 516 output_compression: NoCompression
Dec 06 10:04:04 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:04:04.907419) EVENT_LOG_v1 {"time_micros": 1765015444907411, "job": 22, "event": "compaction_finished", "compaction_time_micros": 82039, "compaction_time_cpu_micros": 38881, "output_level": 6, "num_output_files": 1, "total_output_size": 15736878, "num_input_records": 5998, "num_output_records": 5482, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 06 10:04:04 compute-1 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000041.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:04:04 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015444908249, "job": 22, "event": "table_file_deletion", "file_number": 41}
Dec 06 10:04:04 compute-1 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000039.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:04:04 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015444910991, "job": 22, "event": "table_file_deletion", "file_number": 39}
Dec 06 10:04:04 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:04:04.823747) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:04:04 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:04:04.911089) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:04:04 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:04:04.911093) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:04:04 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:04:04.911094) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:04:04 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:04:04.911096) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:04:04 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:04:04.911097) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:04:05 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:05 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c002180 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:05 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:04:05 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:04:05 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:05.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:04:05 compute-1 ceph-mon[79770]: pgmap v677: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:04:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:06 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:06 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:04:06 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:04:06 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:06.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:04:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:06 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004200 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:06 compute-1 podman[230384]: 2025-12-06 10:04:06.781011457 +0000 UTC m=+0.080511895 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent)
Dec 06 10:04:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:07 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280014d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:07 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:04:07 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:04:07 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:07.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:04:07 compute-1 ceph-mon[79770]: pgmap v678: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:04:08 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:08 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c002180 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:08 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:04:08 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:04:08 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:08.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:04:08 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:08 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:08 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:04:08 compute-1 ceph-mon[79770]: pgmap v679: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:04:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:09 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004200 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:09 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:04:09 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:04:09 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:09.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:04:09 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:04:10 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:10 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004200 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:10 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:04:10 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:04:10 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:10.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:04:10 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:10 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c002e90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:10 compute-1 ceph-mon[79770]: pgmap v680: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:04:10 compute-1 sudo[230405]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:04:10 compute-1 sudo[230405]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:04:10 compute-1 sudo[230405]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:11 compute-1 podman[230429]: 2025-12-06 10:04:11.010774307 +0000 UTC m=+0.055859233 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:04:11 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:11 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:11 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:04:11 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:04:11 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:11.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:04:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:12 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004200 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:12 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:04:12 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:04:12 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:12.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:04:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:12 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004200 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:13 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:13 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c002e90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:13 compute-1 ceph-mon[79770]: pgmap v681: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:04:13 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:04:13 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:04:13 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:13.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:04:13 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:04:14 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:14 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:14 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:04:14 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:04:14 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:14.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:04:14 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:14 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:15 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:15 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:15 compute-1 ceph-mon[79770]: pgmap v682: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:04:15 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:04:15 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:04:15 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:15.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:04:16 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:16 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c003ba0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:16 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:04:16 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:04:16 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:16.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:04:16 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:16 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:17 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280014d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:17 compute-1 ceph-mon[79770]: pgmap v683: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:04:17 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:04:17 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:04:17 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:17.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:04:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:18 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004200 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:18 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:04:18 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:04:18 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:18.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:04:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:18 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c003ba0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:18 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:04:18 compute-1 ceph-mon[79770]: pgmap v684: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:04:19 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:19 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:19 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:04:19 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:04:19 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:19.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:04:20 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:20 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280014d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:20 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:04:20 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:04:20 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:20.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:04:20 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:20 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280014d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:21 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:21 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c003ba0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:21 compute-1 ceph-mon[79770]: pgmap v685: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:04:21 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:04:21 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:04:21 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:21.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:04:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:22 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:22 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:04:22 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:04:22 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:22.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:04:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:22 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004200 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:23 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:23 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280014d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:23 compute-1 ceph-mon[79770]: pgmap v686: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:04:23 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:04:23 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:04:23 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:23.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:04:23 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:04:24 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:24 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0048b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:24 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:04:24 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:04:24 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:24.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:04:24 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:24 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:24 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:04:25 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:25 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004200 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:25 compute-1 ceph-mon[79770]: pgmap v687: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:04:25 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:04:25 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:04:25 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:25.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:04:26 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:26 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280014d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:26 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:04:26 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:04:26 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:26.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:04:26 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:26 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0048b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:27 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:27 compute-1 ceph-mon[79770]: pgmap v688: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:04:27 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:04:27 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:04:27 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:27.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:04:28 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:28 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004200 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:28 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:04:28 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:04:28 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:28.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:04:28 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:28 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280044a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:28 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:04:29 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:29 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0048b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:29 compute-1 ceph-mon[79770]: pgmap v689: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:04:29 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:04:29 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:04:29 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:29.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:04:30 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:30 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:30 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:04:30 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:04:30 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:30.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:04:30 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:30 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004200 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:31 compute-1 sudo[230461]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:04:31 compute-1 sudo[230461]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:04:31 compute-1 sudo[230461]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:31 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:31 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280044a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:31 compute-1 ceph-mon[79770]: pgmap v690: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:04:31 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:04:31 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:04:31 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:31.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:04:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:32 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c0048b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:32 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:04:32 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:04:32 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:32.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:04:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:32 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004990 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:33 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:33 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004200 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:33 compute-1 ceph-mon[79770]: pgmap v691: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:04:33 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:04:33 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:04:33 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:33.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:04:33 compute-1 podman[230488]: 2025-12-06 10:04:33.815360492 +0000 UTC m=+0.117960248 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 06 10:04:33 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:04:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:34 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280044a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:34 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:04:34 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:04:34 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:34.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:04:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:34 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280044a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:35 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:35 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004a40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:35 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:04:35 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:04:35 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:35.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:04:35 compute-1 ceph-mon[79770]: pgmap v692: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:04:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:36 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004200 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:36 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:04:36 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:04:36 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:36.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:04:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:36 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:36 compute-1 nova_compute[228576]: 2025-12-06 10:04:36.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:04:36 compute-1 nova_compute[228576]: 2025-12-06 10:04:36.471 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
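[annotation] The pair of DEBUG lines above shows _reclaim_queued_deletes running and immediately returning: with reclaim_instance_interval at or below zero, deferred (soft) deletion is disabled, so there is nothing to purge. A paraphrase of that guard; the names are illustrative, only the config option and the skip message are taken from the log (the real check lives at the nova/compute/manager.py path shown):

    # Sketch of the periodic-task guard the DEBUG line reports.
    reclaim_instance_interval = 0  # nova default: soft-delete reclaim disabled

    def reclaim_queued_deletes():
        if reclaim_instance_interval <= 0:
            print("CONF.reclaim_instance_interval <= 0, skipping...")
            return
        # ...would otherwise look up SOFT_DELETED instances older than the
        # interval and delete them for real here.

    reclaim_queued_deletes()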
Dec 06 10:04:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:37 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218001230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:37 compute-1 nova_compute[228576]: 2025-12-06 10:04:37.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:04:37 compute-1 nova_compute[228576]: 2025-12-06 10:04:37.471 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:04:37 compute-1 nova_compute[228576]: 2025-12-06 10:04:37.471 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:04:37 compute-1 nova_compute[228576]: 2025-12-06 10:04:37.500 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 06 10:04:37 compute-1 nova_compute[228576]: 2025-12-06 10:04:37.500 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:04:37 compute-1 nova_compute[228576]: 2025-12-06 10:04:37.500 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:04:37 compute-1 nova_compute[228576]: 2025-12-06 10:04:37.523 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:04:37 compute-1 nova_compute[228576]: 2025-12-06 10:04:37.524 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:04:37 compute-1 nova_compute[228576]: 2025-12-06 10:04:37.524 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:04:37 compute-1 nova_compute[228576]: 2025-12-06 10:04:37.524 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:04:37 compute-1 nova_compute[228576]: 2025-12-06 10:04:37.524 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
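[annotation] To audit Ceph-backed storage, the resource tracker shells out to the exact command in the processutils line above. A stdlib equivalent of that call, assuming the same /etc/ceph/ceph.conf and a readable client.openstack keyring:

    import json
    import subprocess

    def ceph_df(conf="/etc/ceph/ceph.conf", client_id="openstack"):
        """Run the command the log shows nova executing; return parsed JSON."""
        out = subprocess.run(
            ["ceph", "df", "--format=json", "--id", client_id, "--conf", conf],
            check=True, capture_output=True, text=True,
        ).stdout
        return json.loads(out)

    stats = ceph_df()
    # "stats" and "pools" are the usual top-level keys of `ceph df -f json`;
    # treat the exact schema as an assumption and inspect the dict first.
    print(sorted(stats.keys()))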
Dec 06 10:04:37 compute-1 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #43. Immutable memtables: 0.
Dec 06 10:04:37 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:04:37.716397) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 06 10:04:37 compute-1 ceph-mon[79770]: rocksdb: [db/flush_job.cc:856] [default] [JOB 23] Flushing memtable with next log file: 43
Dec 06 10:04:37 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015477716439, "job": 23, "event": "flush_started", "num_memtables": 1, "num_entries": 532, "num_deletes": 251, "total_data_size": 842012, "memory_usage": 852240, "flush_reason": "Manual Compaction"}
Dec 06 10:04:37 compute-1 ceph-mon[79770]: rocksdb: [db/flush_job.cc:885] [default] [JOB 23] Level-0 flush table #44: started
Dec 06 10:04:37 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015477721668, "cf_name": "default", "job": 23, "event": "table_file_creation", "file_number": 44, "file_size": 393629, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 23088, "largest_seqno": 23615, "table_properties": {"data_size": 391065, "index_size": 600, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 901, "raw_key_size": 6539, "raw_average_key_size": 19, "raw_value_size": 385955, "raw_average_value_size": 1148, "num_data_blocks": 27, "num_entries": 336, "num_filter_entries": 336, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015445, "oldest_key_time": 1765015445, "file_creation_time": 1765015477, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 44, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:04:37 compute-1 ceph-mon[79770]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 23] Flush lasted 5327 microseconds, and 2267 cpu microseconds.
Dec 06 10:04:37 compute-1 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:04:37 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:04:37.721724) [db/flush_job.cc:967] [default] [JOB 23] Level-0 flush table #44: 393629 bytes OK
Dec 06 10:04:37 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:04:37.721750) [db/memtable_list.cc:519] [default] Level-0 commit table #44 started
Dec 06 10:04:37 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:04:37.723114) [db/memtable_list.cc:722] [default] Level-0 commit table #44: memtable #1 done
Dec 06 10:04:37 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:04:37.723131) EVENT_LOG_v1 {"time_micros": 1765015477723125, "job": 23, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 06 10:04:37 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:04:37.723169) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 06 10:04:37 compute-1 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 23] Try to delete WAL files size 838913, prev total WAL file size 838913, number of live WAL files 2.
Dec 06 10:04:37 compute-1 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000040.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:04:37 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:04:37.723740) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D67727374617400353030' seq:72057594037927935, type:22 .. '6D67727374617400373532' seq:0, type:0; will stop at (end)
Dec 06 10:04:37 compute-1 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 24] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 06 10:04:37 compute-1 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 23 Base level 0, inputs: [44(384KB)], [42(15MB)]
Dec 06 10:04:37 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015477723772, "job": 24, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [44], "files_L6": [42], "score": -1, "input_data_size": 16130507, "oldest_snapshot_seqno": -1}
Dec 06 10:04:37 compute-1 podman[230528]: 2025-12-06 10:04:37.752822472 +0000 UTC m=+0.060298222 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Dec 06 10:04:37 compute-1 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 24] Generated table #45: 5318 keys, 12216452 bytes, temperature: kUnknown
Dec 06 10:04:37 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015477988553, "cf_name": "default", "job": 24, "event": "table_file_creation", "file_number": 45, "file_size": 12216452, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12182362, "index_size": 19708, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13317, "raw_key_size": 135099, "raw_average_key_size": 25, "raw_value_size": 12087336, "raw_average_value_size": 2272, "num_data_blocks": 802, "num_entries": 5318, "num_filter_entries": 5318, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014012, "oldest_key_time": 0, "file_creation_time": 1765015477, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 45, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:04:37 compute-1 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:04:37 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:04:37 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:04:37 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:37.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:04:37 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:04:37.988807) [db/compaction/compaction_job.cc:1663] [default] [JOB 24] Compacted 1@0 + 1@6 files to L6 => 12216452 bytes
Dec 06 10:04:37 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:04:37.992719) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 60.9 rd, 46.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.4, 15.0 +0.0 blob) out(11.7 +0.0 blob), read-write-amplify(72.0) write-amplify(31.0) OK, records in: 5818, records dropped: 500 output_compression: NoCompression
Dec 06 10:04:37 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:04:37.992737) EVENT_LOG_v1 {"time_micros": 1765015477992728, "job": 24, "event": "compaction_finished", "compaction_time_micros": 264869, "compaction_time_cpu_micros": 28898, "output_level": 6, "num_output_files": 1, "total_output_size": 12216452, "num_input_records": 5818, "num_output_records": 5318, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
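[annotation] The JOB 24 numbers are internally consistent: RocksDB's read-write-amplify is (total input + output bytes) divided by the newly flushed L0 bytes, and write-amplify is output divided by L0. Checking that against the figures in the events above:

    # Figures taken verbatim from the EVENT_LOG_v1 / compaction summary lines.
    l0_input = 393_629          # table #44, the freshly flushed L0 file
    total_input = 16_130_507    # "input_data_size" (L0 file #44 + L6 file #42)
    output = 12_216_452         # table #45 written back to L6

    read_write_amplify = (total_input + output) / l0_input
    write_amplify = output / l0_input

    print(f"read-write-amplify ~ {read_write_amplify:.1f}")  # ~72.0, as logged
    print(f"write-amplify      ~ {write_amplify:.1f}")       # ~31.0, as logged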
Dec 06 10:04:37 compute-1 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000044.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:04:37 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015477993071, "job": 24, "event": "table_file_deletion", "file_number": 44}
Dec 06 10:04:37 compute-1 ceph-mon[79770]: pgmap v693: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:04:37 compute-1 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000042.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:04:37 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015477996896, "job": 24, "event": "table_file_deletion", "file_number": 42}
Dec 06 10:04:37 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:04:37.723683) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:04:37 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:04:37.996953) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:04:37 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:04:37.996956) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:04:37 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:04:37.996958) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:04:37 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:04:37.996959) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:04:37 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:04:37.996961) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:04:38 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:38 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004a40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:38 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:04:38 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:04:38 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:38.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:04:38 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:04:38 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2156903250' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:04:38 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:38 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004200 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:38 compute-1 nova_compute[228576]: 2025-12-06 10:04:38.214 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.689s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:04:38 compute-1 nova_compute[228576]: 2025-12-06 10:04:38.392 228580 WARNING nova.virt.libvirt.driver [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:04:38 compute-1 nova_compute[228576]: 2025-12-06 10:04:38.395 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5253MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
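[annotation] The hypervisor resource view above lists eleven PCI devices; vendor 0x1af4 is Red Hat (virtio devices) and 0x8086 is Intel (the emulated i440FX-era chipset functions), consistent with the KVM guest this node is. Grouping the logged list by vendor, with the entries abbreviated to the fields used:

    from collections import Counter

    # Abbreviated copy of the pci_devices entries from the resource view above.
    pci_devices = [
        {"address": "0000:00:04.0", "vendor_id": "1af4", "product_id": "1001"},
        {"address": "0000:00:06.0", "vendor_id": "1af4", "product_id": "1005"},
        {"address": "0000:00:01.1", "vendor_id": "8086", "product_id": "7010"},
        {"address": "0000:00:02.0", "vendor_id": "1af4", "product_id": "1050"},
        {"address": "0000:00:00.0", "vendor_id": "8086", "product_id": "1237"},
        {"address": "0000:00:05.0", "vendor_id": "1af4", "product_id": "1002"},
        {"address": "0000:00:01.2", "vendor_id": "8086", "product_id": "7020"},
        {"address": "0000:00:07.0", "vendor_id": "1af4", "product_id": "1000"},
        {"address": "0000:00:03.0", "vendor_id": "1af4", "product_id": "1000"},
        {"address": "0000:00:01.3", "vendor_id": "8086", "product_id": "7113"},
        {"address": "0000:00:01.0", "vendor_id": "8086", "product_id": "7000"},
    ]

    vendors = {"1af4": "virtio (Red Hat)", "8086": "Intel (emulated chipset)"}
    for vid, n in Counter(d["vendor_id"] for d in pci_devices).items():
        print(f"{vendors.get(vid, vid)}: {n} device(s)")
    # -> virtio (Red Hat): 6 device(s); Intel (emulated chipset): 5 device(s)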
Dec 06 10:04:38 compute-1 nova_compute[228576]: 2025-12-06 10:04:38.395 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:04:38 compute-1 nova_compute[228576]: 2025-12-06 10:04:38.395 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:04:38 compute-1 nova_compute[228576]: 2025-12-06 10:04:38.459 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:04:38 compute-1 nova_compute[228576]: 2025-12-06 10:04:38.460 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:04:38 compute-1 nova_compute[228576]: 2025-12-06 10:04:38.477 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:04:38 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:04:38 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:04:38 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - - [06/Dec/2025:10:04:38.730 +0000] "GET /swift/info HTTP/1.1" 200 539 - "python-urllib3/1.26.5" - latency=0.001000024s
Dec 06 10:04:38 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:04:38 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:04:38 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2295503369' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
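[annotation] The handle_command/audit pairs show the monitor dispatching {"prefix": "df", "format": "json"} on behalf of client.openstack. The same mon command can be sent without the CLI through the librados Python binding; a sketch assuming the python3-rados package and the same client key are available:

    import json
    import rados  # python3-rados binding; assumed installed

    cluster = rados.Rados(conffile="/etc/ceph/ceph.conf", name="client.openstack")
    cluster.connect()
    try:
        # Same JSON the monitor logs as handle_command above.
        cmd = json.dumps({"prefix": "df", "format": "json"})
        ret, outbuf, errs = cluster.mon_command(cmd, b"")
        if ret == 0:
            df = json.loads(outbuf)
            # "stats"/"total_bytes" assumed per the usual `ceph df -f json` shape.
            print(df["stats"]["total_bytes"])
    finally:
        cluster.shutdown()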
Dec 06 10:04:38 compute-1 nova_compute[228576]: 2025-12-06 10:04:38.934 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:04:38 compute-1 nova_compute[228576]: 2025-12-06 10:04:38.940 228580 DEBUG nova.compute.provider_tree [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed in ProviderTree for provider: ff2f17cb-ff1d-4da7-9560-4be741380cb1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:04:38 compute-1 nova_compute[228576]: 2025-12-06 10:04:38.961 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed for provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:04:38 compute-1 nova_compute[228576]: 2025-12-06 10:04:38.963 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:04:38 compute-1 nova_compute[228576]: 2025-12-06 10:04:38.963 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.568s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
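[annotation] The inventory reported to placement above encodes schedulable capacity as (total - reserved) x allocation_ratio per resource class, which is how the placement service sizes each provider. Applied to the logged values:

    # Inventory values copied from the scheduler report-client line above.
    inventory = {
        "MEMORY_MB": {"total": 7679, "reserved": 512, "allocation_ratio": 1.0},
        "VCPU":      {"total": 8,    "reserved": 0,   "allocation_ratio": 4.0},
        "DISK_GB":   {"total": 59,   "reserved": 0,   "allocation_ratio": 0.9},
    }

    for rc, inv in inventory.items():
        schedulable = (inv["total"] - inv["reserved"]) * inv["allocation_ratio"]
        print(f"{rc}: {schedulable:g} schedulable")
    # -> MEMORY_MB: 7167, VCPU: 32, DISK_GB: 53.1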
Dec 06 10:04:39 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/2156903250' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:04:39 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/3692914369' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:04:39 compute-1 ceph-mon[79770]: pgmap v694: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:04:39 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/2295503369' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:04:39 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:04:39 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/870583344' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:04:39 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:39 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:39 compute-1 nova_compute[228576]: 2025-12-06 10:04:39.934 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:04:39 compute-1 nova_compute[228576]: 2025-12-06 10:04:39.961 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:04:39 compute-1 nova_compute[228576]: 2025-12-06 10:04:39.961 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:04:39 compute-1 nova_compute[228576]: 2025-12-06 10:04:39.961 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:04:39 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:04:39 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:04:39 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:39.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:04:40 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/1352342801' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:04:40 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/1282252340' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:04:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:40 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218002140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:40 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:04:40 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:04:40 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:40.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:04:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:40 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218002140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:40 compute-1 nova_compute[228576]: 2025-12-06 10:04:40.489 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:04:41 compute-1 ceph-mon[79770]: pgmap v695: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:04:41 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:41 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218002140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:41 compute-1 nova_compute[228576]: 2025-12-06 10:04:41.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:04:41 compute-1 sudo[230581]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:04:41 compute-1 sudo[230581]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:04:41 compute-1 sudo[230581]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:41 compute-1 podman[230582]: 2025-12-06 10:04:41.755249777 +0000 UTC m=+0.061122712 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 06 10:04:41 compute-1 sudo[230623]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Dec 06 10:04:41 compute-1 sudo[230623]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:04:41 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:04:41 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:04:41 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:41.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:04:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:42 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:42 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:04:42 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:04:42 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:42.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:04:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:42 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004a40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:42 compute-1 podman[230725]: 2025-12-06 10:04:42.319827718 +0000 UTC m=+0.066587794 container exec 500f8c89b5c281d0ace139835b07ea5bc6259ce3e028d8284d57da97424b55e0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-crash-compute-1, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True)
Dec 06 10:04:42 compute-1 podman[230725]: 2025-12-06 10:04:42.415171334 +0000 UTC m=+0.161931430 container exec_died 500f8c89b5c281d0ace139835b07ea5bc6259ce3e028d8284d57da97424b55e0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-crash-compute-1, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 06 10:04:42 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Dec 06 10:04:42 compute-1 podman[230839]: 2025-12-06 10:04:42.914264631 +0000 UTC m=+0.058245086 container exec 6af22af7046e22bedbb2fb280e4d2c530c5b3cac3959f396bf7fe3d14752a7eb (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 06 10:04:42 compute-1 podman[230839]: 2025-12-06 10:04:42.927528031 +0000 UTC m=+0.071508476 container exec_died 6af22af7046e22bedbb2fb280e4d2c530c5b3cac3959f396bf7fe3d14752a7eb (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 06 10:04:43 compute-1 podman[230928]: 2025-12-06 10:04:43.243389139 +0000 UTC m=+0.052445746 container exec cfd84277d1dcac04a876f3b0ccbf223dd9196bdf0059805be5855adee48962d9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 06 10:04:43 compute-1 podman[230928]: 2025-12-06 10:04:43.257707414 +0000 UTC m=+0.066764031 container exec_died cfd84277d1dcac04a876f3b0ccbf223dd9196bdf0059805be5855adee48962d9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, ceph=True, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1)
Dec 06 10:04:43 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:43 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004200 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:43 compute-1 podman[230993]: 2025-12-06 10:04:43.453530017 +0000 UTC m=+0.047496506 container exec 70891cd2190622057f9c45299e27938f7b2105f0244eda3658dedfb18fed50f0 (image=quay.io/ceph/haproxy:2.3, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd)
Dec 06 10:04:43 compute-1 ceph-mon[79770]: pgmap v696: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:04:43 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e147 e147: 3 total, 3 up, 3 in
Dec 06 10:04:43 compute-1 podman[230993]: 2025-12-06 10:04:43.500683554 +0000 UTC m=+0.094650033 container exec_died 70891cd2190622057f9c45299e27938f7b2105f0244eda3658dedfb18fed50f0 (image=quay.io/ceph/haproxy:2.3, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd)
Dec 06 10:04:43 compute-1 podman[231060]: 2025-12-06 10:04:43.693564816 +0000 UTC m=+0.049147126 container exec c8ec7212805c01399bc295ce2c5e69b11fbde393e887859b5ab336e81cd6d1f1 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-1-uzbtlt, architecture=x86_64, name=keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, build-date=2023-02-22T09:23:20, version=2.2.4, com.redhat.component=keepalived-container, io.k8s.display-name=Keepalived on RHEL 9, summary=Provides keepalived on RHEL 9 for Ceph., description=keepalived for Ceph, io.buildah.version=1.28.2, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-type=git, io.openshift.tags=Ceph keepalived, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9)
Dec 06 10:04:43 compute-1 podman[231060]: 2025-12-06 10:04:43.707489262 +0000 UTC m=+0.063071562 container exec_died c8ec7212805c01399bc295ce2c5e69b11fbde393e887859b5ab336e81cd6d1f1 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-1-uzbtlt, io.openshift.expose-services=, com.redhat.component=keepalived-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Keepalived on RHEL 9, release=1793, vcs-type=git, name=keepalived, io.buildah.version=1.28.2, summary=Provides keepalived on RHEL 9 for Ceph., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.tags=Ceph keepalived, architecture=x86_64, build-date=2023-02-22T09:23:20, description=keepalived for Ceph, vendor=Red Hat, Inc., version=2.2.4, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793)
Dec 06 10:04:43 compute-1 sudo[230623]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:43 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:04:43 compute-1 sudo[231093]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:04:43 compute-1 sudo[231093]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:04:43 compute-1 sudo[231093]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:43 compute-1 sudo[231118]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 06 10:04:43 compute-1 sudo[231118]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:04:43 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:04:43 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:04:43 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:43.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:04:44 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:44 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218002140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:44 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:04:44 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:04:44 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:44.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:04:44 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:44 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:44 compute-1 sudo[231118]: pam_unix(sudo:session): session closed for user root
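[annotation] The sudo sessions above are the Ceph orchestrator driving a fsid-local copy of cephadm (`... cephadm.1a88... --timeout 895 ls` and `... gather-facts`), both of which print JSON for the mgr to consume. A hedged sketch of the `ls` half, assuming a system-wide cephadm on PATH and root privileges; the exact output schema is an assumption to verify:

    import json
    import subprocess

    def cephadm_ls():
        """`cephadm ls` prints a JSON array describing every daemon deployed
        on this host (mon, mgr, osd, nfs, ...). "name" and "state" are the
        commonly seen fields; inspect the output before relying on them."""
        out = subprocess.run(
            ["cephadm", "ls"], check=True, capture_output=True, text=True,
        ).stdout
        return json.loads(out)

    for daemon in cephadm_ls():
        print(daemon.get("name"), daemon.get("state"))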
Dec 06 10:04:44 compute-1 ceph-mon[79770]: osdmap e147: 3 total, 3 up, 3 in
Dec 06 10:04:44 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:04:44 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:04:44 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:04:44 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:04:44 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e148 e148: 3 total, 3 up, 3 in
Dec 06 10:04:45 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:45 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004a40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:45 compute-1 ceph-mon[79770]: pgmap v698: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 307 B/s rd, 0 op/s
Dec 06 10:04:45 compute-1 ceph-mon[79770]: osdmap e148: 3 total, 3 up, 3 in
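[annotation] The recurring "3 total, 3 up, 3 in" osdmap lines distinguish two OSD flags: "up" is liveness (the daemon is running and heartbeating), "in" means the OSD participates in CRUSH data placement; an OSD can be up but out, or down yet still in during the grace period before rebalancing. A sketch that reads the same counters via `ceph osd stat`, with the JSON field names treated as assumptions:

    import json
    import subprocess

    # `ceph osd stat -f json` mirrors the "3 total, 3 up, 3 in" osdmap lines;
    # num_osds / num_up_osds / num_in_osds are assumed field names.
    out = subprocess.run(
        ["ceph", "osd", "stat", "-f", "json",
         "--id", "openstack", "--conf", "/etc/ceph/ceph.conf"],
        check=True, capture_output=True, text=True,
    ).stdout
    s = json.loads(out)
    print(f'{s["num_osds"]} total, {s["num_up_osds"]} up, {s["num_in_osds"]} in')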
Dec 06 10:04:45 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Dec 06 10:04:45 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Dec 06 10:04:45 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 10:04:45 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 06 10:04:45 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:04:45 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:04:45 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 06 10:04:45 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 06 10:04:45 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 10:04:45 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e149 e149: 3 total, 3 up, 3 in
Dec 06 10:04:46 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:04:46 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:04:46 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:45.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:04:46 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:46 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004a40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:46 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:04:46 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:04:46 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:46.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:04:46 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:46 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004a40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:47 compute-1 ceph-mon[79770]: osdmap e149: 3 total, 3 up, 3 in
Dec 06 10:04:47 compute-1 ceph-mon[79770]: from='client.? 192.168.122.10:0/3880271287' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 06 10:04:47 compute-1 ceph-mon[79770]: from='client.? 192.168.122.10:0/3880271287' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 06 10:04:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:47 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:48 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:04:48 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:04:48 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:48.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:04:48 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:48 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004a40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:48 compute-1 ceph-mon[79770]: pgmap v701: 337 pgs: 337 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail
Dec 06 10:04:48 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:04:48 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:04:48 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:48.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:04:48 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e150 e150: 3 total, 3 up, 3 in
Dec 06 10:04:48 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:48 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218002140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:48 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:04:49 compute-1 ceph-mon[79770]: osdmap e150: 3 total, 3 up, 3 in
Dec 06 10:04:49 compute-1 ceph-mon[79770]: pgmap v703: 337 pgs: 337 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 47 KiB/s rd, 8.3 MiB/s wr, 68 op/s
Dec 06 10:04:49 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:49 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004200 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:49 compute-1 sudo[231177]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:04:49 compute-1 sudo[231177]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:04:49 compute-1 sudo[231177]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:50 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:04:50 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:04:50 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:50.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:04:50 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:50 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:50 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:04:50 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:04:50 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:50.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:04:50 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:50 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004a40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:50 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:04:50 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:04:51 compute-1 sudo[231203]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:04:51 compute-1 sudo[231203]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:04:51 compute-1 sudo[231203]: pam_unix(sudo:session): session closed for user root
Dec 06 10:04:51 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:51 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:51 compute-1 ceph-mon[79770]: pgmap v704: 337 pgs: 337 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 39 KiB/s rd, 6.8 MiB/s wr, 56 op/s
Dec 06 10:04:52 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:04:52 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:04:52 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:52.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:04:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:52 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004200 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:52 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:04:52 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:04:52 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:52.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:04:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:52 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:52 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e151 e151: 3 total, 3 up, 3 in
Dec 06 10:04:53 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:53 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004a40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:53 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:04:54 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:04:54 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:04:54 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:54.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:04:54 compute-1 ceph-mon[79770]: pgmap v705: 337 pgs: 337 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 29 KiB/s rd, 5.2 MiB/s wr, 42 op/s
Dec 06 10:04:54 compute-1 ceph-mon[79770]: osdmap e151: 3 total, 3 up, 3 in
Dec 06 10:04:54 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:54 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:04:54 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:04:54 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:54.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:04:54 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004200 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:04:54.278 141446 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:04:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:04:54.279 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:04:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:04:54.279 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:04:55 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:04:55 compute-1 ceph-mon[79770]: pgmap v707: 337 pgs: 337 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 33 KiB/s rd, 5.1 MiB/s wr, 48 op/s
Dec 06 10:04:55 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:55 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:56 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:04:56 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:04:56 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:56.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:04:56 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:56 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004a40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:56 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:04:56 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:04:56 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:56.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:04:56 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:56 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:57 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004200 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:57 compute-1 ceph-mon[79770]: pgmap v708: 337 pgs: 337 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 4.0 KiB/s rd, 621 B/s wr, 5 op/s
Dec 06 10:04:58 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:04:58 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:04:58 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:04:58.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:04:58 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:58 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:58 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:04:58 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:04:58 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:04:58.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:04:58 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:58 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004a40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:58 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:04:59 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:04:59 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:04:59 compute-1 ceph-mon[79770]: pgmap v709: 337 pgs: 337 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 3.6 KiB/s rd, 511 B/s wr, 4 op/s
Dec 06 10:05:00 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:05:00 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:05:00 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:00.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:05:00 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:00 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004200 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:00 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:05:00 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:05:00 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:00.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:05:00 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:00 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:01 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:01 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004a40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:01 compute-1 ceph-mon[79770]: pgmap v710: 337 pgs: 337 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 3.6 KiB/s rd, 511 B/s wr, 4 op/s
Dec 06 10:05:02 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:05:02 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:05:02 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:02.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:05:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:02 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:02 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:05:02 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:05:02 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:02.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:05:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:02 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:03 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:03 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:03 compute-1 ceph-mon[79770]: pgmap v711: 337 pgs: 337 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 3.6 KiB/s rd, 511 B/s wr, 4 op/s
Dec 06 10:05:03 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:05:04 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:05:04 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:05:04 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:04.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:05:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:04 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004be0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:04 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:05:04 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:05:04 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:04.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:05:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:04 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:04 compute-1 podman[231235]: 2025-12-06 10:05:04.823247206 +0000 UTC m=+0.115002824 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:05:05 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:05 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:05 compute-1 ceph-mon[79770]: pgmap v712: 337 pgs: 337 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 3.1 KiB/s rd, 438 B/s wr, 4 op/s
Dec 06 10:05:06 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:05:06 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:05:06 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:06.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:05:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:06 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:06 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:05:06 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:05:06 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:06.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:05:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:06 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c001b90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:07 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004200 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:07 compute-1 ceph-mon[79770]: pgmap v713: 337 pgs: 337 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:05:08 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:05:08 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:05:08.038 141446 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '9a:dc:0d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'b6:0a:c4:b8:be:39'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:05:08 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:05:08 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:08.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:05:08 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:05:08.039 141446 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 06 10:05:08 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:08 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004200 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:08 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:05:08 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:05:08 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:08.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:05:08 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:08 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228001fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:08 compute-1 podman[231265]: 2025-12-06 10:05:08.76734588 +0000 UTC m=+0.071161747 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:05:08 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:05:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:09 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c001b90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:09 compute-1 ceph-mon[79770]: pgmap v714: 337 pgs: 337 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:05:09 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:05:10 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:05:10 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:05:10 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:10.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:05:10 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:10 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c001b90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:10 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:05:10 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:05:10 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:10.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:05:10 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:10 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c001b90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:10 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/100510 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec 06 10:05:11 compute-1 sudo[231286]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:05:11 compute-1 sudo[231286]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:05:11 compute-1 sudo[231286]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:11 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:11 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:11 compute-1 ceph-mon[79770]: pgmap v715: 337 pgs: 337 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:05:12 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:05:12 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:05:12 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:12.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:05:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:12 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004200 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:12 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:05:12 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:05:12 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:12.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:05:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:12 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228001fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:12 compute-1 podman[231312]: 2025-12-06 10:05:12.771059992 +0000 UTC m=+0.079605201 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, tcib_managed=true)
Dec 06 10:05:13 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:13 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c001b90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:13 compute-1 ceph-mon[79770]: pgmap v716: 337 pgs: 337 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:05:13 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:05:14 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:05:14 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:05:14 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:14.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:05:14 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:14 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:14 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:05:14 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:05:14 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:14.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:05:14 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:14 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004200 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:15 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/100515 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec 06 10:05:15 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:15 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228001fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:15 compute-1 ceph-mon[79770]: pgmap v717: 337 pgs: 337 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:05:16 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:05:16 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:05:16 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:16.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:05:16 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:16 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c001b90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:16 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:05:16 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:05:16 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:16.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:05:16 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:16 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:17 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:05:17.041 141446 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=61eba479-a995-4b31-88b9-8ebfcea9907e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 10:05:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:17 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004200 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:18 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:05:18 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:05:18 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:18.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:05:18 compute-1 ceph-mon[79770]: pgmap v718: 337 pgs: 337 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:05:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:18 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228001fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:18 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:05:18 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:05:18 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:18.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:05:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:18 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c004520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:18 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:05:18 compute-1 sshd-session[231335]: Connection closed by authenticating user root 45.10.175.77 port 34246 [preauth]
Dec 06 10:05:19 compute-1 ceph-mon[79770]: pgmap v719: 337 pgs: 337 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 682 B/s rd, 85 B/s wr, 0 op/s
Dec 06 10:05:19 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:19 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:20 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:05:20 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:05:20 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:20.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:05:20 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:20 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:05:20 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:20 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004200 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:20 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:05:20 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:05:20 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:20.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:05:20 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:20 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228001fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:21 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:21 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c004520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:21 compute-1 ceph-mon[79770]: pgmap v720: 337 pgs: 337 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 85 B/s wr, 0 op/s
Dec 06 10:05:22 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:05:22 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:05:22 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:22.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:05:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:22 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:22 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:05:22 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:05:22 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:22.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:05:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:22 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004200 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:23 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:23 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:05:23 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:23 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:05:23 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:23 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228001fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:23 compute-1 ceph-mon[79770]: pgmap v721: 337 pgs: 337 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 85 B/s wr, 0 op/s
Dec 06 10:05:23 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:23 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:05:23 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:05:24 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:05:24 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:05:24 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:24.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:05:24 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:24 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c004520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:24 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:05:24 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:05:24 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:24.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:05:24 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:24 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:24 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:05:25 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:25 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004200 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:25 compute-1 ceph-mon[79770]: pgmap v722: 337 pgs: 337 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 511 B/s wr, 1 op/s
Dec 06 10:05:26 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:05:26 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:05:26 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:26.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:05:26 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:26 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228001fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:26 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:05:26 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:05:26 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:26.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:05:26 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:26 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c004520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:26 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:26 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec 06 10:05:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:27 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:27 compute-1 ceph-mon[79770]: pgmap v723: 337 pgs: 337 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 511 B/s wr, 1 op/s
Dec 06 10:05:28 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:05:28 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:05:28 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:28.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:05:28 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:28 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004220 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:28 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:05:28 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:05:28 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:28.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:05:28 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:28 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:28 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:05:29 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:29 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228001fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:29 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:29 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:05:29 compute-1 ceph-mon[79770]: pgmap v724: 337 pgs: 337 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 2.4 KiB/s rd, 1023 B/s wr, 3 op/s
Dec 06 10:05:30 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:05:30 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:05:30 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:30.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:05:30 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:30 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c004520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:30 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:05:30 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:05:30 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:30.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:05:30 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:30 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004240 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:31 compute-1 sudo[231344]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:05:31 compute-1 sudo[231344]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:05:31 compute-1 sudo[231344]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:31 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:31 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:31 compute-1 ceph-mon[79770]: pgmap v725: 337 pgs: 337 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 2.0 KiB/s rd, 938 B/s wr, 3 op/s
Dec 06 10:05:32 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:05:32 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:05:32 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:32.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:05:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:32 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228001fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:32 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:05:32 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:05:32 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:32.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:05:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:32 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c004520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/100532 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec 06 10:05:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:32 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:05:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:32 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:05:33 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:33 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004260 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:33 compute-1 ceph-mon[79770]: pgmap v726: 337 pgs: 337 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 2.0 KiB/s rd, 938 B/s wr, 3 op/s
Dec 06 10:05:33 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/2348345609' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:05:33 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:05:34 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:05:34 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:05:34 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:34.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:05:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:34 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218003dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:34 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:05:34 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:05:34 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:34.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:05:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:34 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228001fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:35 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:35 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec 06 10:05:35 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:35 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c004520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:35 compute-1 ceph-mon[79770]: pgmap v727: 337 pgs: 337 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 3.6 KiB/s rd, 1.6 KiB/s wr, 5 op/s
Dec 06 10:05:35 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e152 e152: 3 total, 3 up, 3 in
Dec 06 10:05:35 compute-1 podman[231371]: 2025-12-06 10:05:35.847972234 +0000 UTC m=+0.141137805 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:05:36 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:05:36 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:05:36 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:36.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:05:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:36 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004280 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:36 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:05:36 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:05:36 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:36.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:05:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:36 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22180046f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:36 compute-1 nova_compute[228576]: 2025-12-06 10:05:36.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:05:36 compute-1 nova_compute[228576]: 2025-12-06 10:05:36.471 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:05:36 compute-1 nova_compute[228576]: 2025-12-06 10:05:36.472 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:05:36 compute-1 nova_compute[228576]: 2025-12-06 10:05:36.472 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Dec 06 10:05:36 compute-1 nova_compute[228576]: 2025-12-06 10:05:36.502 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Dec 06 10:05:36 compute-1 nova_compute[228576]: 2025-12-06 10:05:36.503 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:05:36 compute-1 nova_compute[228576]: 2025-12-06 10:05:36.504 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Dec 06 10:05:36 compute-1 nova_compute[228576]: 2025-12-06 10:05:36.518 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:05:36 compute-1 ceph-mon[79770]: osdmap e152: 3 total, 3 up, 3 in
Dec 06 10:05:36 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e153 e153: 3 total, 3 up, 3 in
Dec 06 10:05:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/100537 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec 06 10:05:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:37 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22180046f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:37 compute-1 nova_compute[228576]: 2025-12-06 10:05:37.526 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:05:37 compute-1 nova_compute[228576]: 2025-12-06 10:05:37.527 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:05:37 compute-1 nova_compute[228576]: 2025-12-06 10:05:37.527 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:05:37 compute-1 nova_compute[228576]: 2025-12-06 10:05:37.546 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 06 10:05:37 compute-1 nova_compute[228576]: 2025-12-06 10:05:37.547 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:05:37 compute-1 ceph-mon[79770]: pgmap v729: 337 pgs: 337 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 3.5 KiB/s rd, 1.4 KiB/s wr, 5 op/s
Dec 06 10:05:37 compute-1 ceph-mon[79770]: osdmap e153: 3 total, 3 up, 3 in
Dec 06 10:05:37 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/2654563727' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec 06 10:05:38 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:05:38 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:05:38 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:38.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:05:38 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:38 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c004520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:38 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:05:38 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:05:38 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:38.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:05:38 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:38 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004190 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:38 compute-1 nova_compute[228576]: 2025-12-06 10:05:38.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:05:38 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/82878470' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec 06 10:05:38 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/551840607' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:05:38 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e153 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:05:39 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:39 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22180046f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:39 compute-1 nova_compute[228576]: 2025-12-06 10:05:39.469 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:05:39 compute-1 nova_compute[228576]: 2025-12-06 10:05:39.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:05:39 compute-1 nova_compute[228576]: 2025-12-06 10:05:39.492 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:05:39 compute-1 nova_compute[228576]: 2025-12-06 10:05:39.492 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:05:39 compute-1 nova_compute[228576]: 2025-12-06 10:05:39.493 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:05:39 compute-1 nova_compute[228576]: 2025-12-06 10:05:39.493 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:05:39 compute-1 nova_compute[228576]: 2025-12-06 10:05:39.493 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:05:39 compute-1 podman[231423]: 2025-12-06 10:05:39.757778631 +0000 UTC m=+0.061746820 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 06 10:05:39 compute-1 ceph-mon[79770]: pgmap v731: 337 pgs: 337 active+clean; 88 MiB data, 220 MiB used, 60 GiB / 60 GiB avail; 2.6 MiB/s rd, 2.7 MiB/s wr, 54 op/s
Dec 06 10:05:39 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:05:39 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/2829121739' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:05:39 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:05:39 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3018802899' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:05:39 compute-1 nova_compute[228576]: 2025-12-06 10:05:39.994 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.501s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:05:40 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:05:40 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:05:40 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:40.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:05:40 compute-1 nova_compute[228576]: 2025-12-06 10:05:40.142 228580 WARNING nova.virt.libvirt.driver [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:05:40 compute-1 nova_compute[228576]: 2025-12-06 10:05:40.143 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5250MB free_disk=59.967525482177734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:05:40 compute-1 nova_compute[228576]: 2025-12-06 10:05:40.144 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:05:40 compute-1 nova_compute[228576]: 2025-12-06 10:05:40.144 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:05:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:40 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228001fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:40 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:05:40 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:05:40 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:40.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:05:40 compute-1 nova_compute[228576]: 2025-12-06 10:05:40.246 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:05:40 compute-1 nova_compute[228576]: 2025-12-06 10:05:40.247 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:05:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:40 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228001fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:40 compute-1 nova_compute[228576]: 2025-12-06 10:05:40.290 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Refreshing inventories for resource provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec 06 10:05:40 compute-1 nova_compute[228576]: 2025-12-06 10:05:40.373 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Updating ProviderTree inventory for provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec 06 10:05:40 compute-1 nova_compute[228576]: 2025-12-06 10:05:40.374 228580 DEBUG nova.compute.provider_tree [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Updating inventory in ProviderTree for provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 06 10:05:40 compute-1 nova_compute[228576]: 2025-12-06 10:05:40.396 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Refreshing aggregate associations for resource provider ff2f17cb-ff1d-4da7-9560-4be741380cb1, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec 06 10:05:40 compute-1 nova_compute[228576]: 2025-12-06 10:05:40.426 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Refreshing trait associations for resource provider ff2f17cb-ff1d-4da7-9560-4be741380cb1, traits: COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_ACCELERATORS,HW_CPU_X86_SVM,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_AESNI,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSSE3,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE4A,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_FMA3,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_AVX,HW_CPU_X86_BMI2,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SHA,HW_CPU_X86_SSE2,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_F16C,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_ABM,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NODE,COMPUTE_SECURITY_TPM_2_0,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_TPM_1_2,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_CLMUL,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AVX2,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_BMI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SOCKET_PCI_NUMA_AFFINITY _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec 06 10:05:40 compute-1 nova_compute[228576]: 2025-12-06 10:05:40.446 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:05:40 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/3018802899' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:05:40 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:05:40 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2320957764' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:05:40 compute-1 nova_compute[228576]: 2025-12-06 10:05:40.931 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.485s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:05:40 compute-1 nova_compute[228576]: 2025-12-06 10:05:40.937 228580 DEBUG nova.compute.provider_tree [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed in ProviderTree for provider: ff2f17cb-ff1d-4da7-9560-4be741380cb1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:05:40 compute-1 nova_compute[228576]: 2025-12-06 10:05:40.950 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed for provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:05:40 compute-1 nova_compute[228576]: 2025-12-06 10:05:40.951 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:05:40 compute-1 nova_compute[228576]: 2025-12-06 10:05:40.952 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.807s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:05:41 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:41 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004190 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:41 compute-1 ceph-mon[79770]: pgmap v732: 337 pgs: 337 active+clean; 88 MiB data, 220 MiB used, 60 GiB / 60 GiB avail; 2.6 MiB/s rd, 2.7 MiB/s wr, 54 op/s
Dec 06 10:05:41 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/2320957764' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:05:41 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/3766974847' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:05:41 compute-1 nova_compute[228576]: 2025-12-06 10:05:41.953 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:05:42 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:05:42 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:05:42 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:42.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:05:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:42 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22180046f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:42 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:05:42 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:05:42 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:42.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:05:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:42 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c004520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:42 compute-1 nova_compute[228576]: 2025-12-06 10:05:42.463 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:05:42 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 e154: 3 total, 3 up, 3 in
Dec 06 10:05:42 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/3852513152' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:05:42 compute-1 ceph-mon[79770]: osdmap e154: 3 total, 3 up, 3 in
Dec 06 10:05:43 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:43 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228001fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:43 compute-1 nova_compute[228576]: 2025-12-06 10:05:43.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:05:43 compute-1 podman[231468]: 2025-12-06 10:05:43.757057967 +0000 UTC m=+0.062444177 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:05:43 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:05:44 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:05:44 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:05:44 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:44.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:05:44 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:44 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:44 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:05:44 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:05:44 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:44.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:05:44 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:44 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22180046f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:45 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:45 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c004520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:45 compute-1 ceph-mon[79770]: pgmap v733: 337 pgs: 337 active+clean; 88 MiB data, 220 MiB used, 60 GiB / 60 GiB avail; 2.6 MiB/s rd, 2.7 MiB/s wr, 51 op/s
Dec 06 10:05:46 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:05:46 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:05:46 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:46.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:05:46 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:46 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228001fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:46 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:05:46 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:05:46 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:46.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:05:46 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:46 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:46 compute-1 ceph-mon[79770]: pgmap v735: 337 pgs: 337 active+clean; 88 MiB data, 220 MiB used, 60 GiB / 60 GiB avail; 5.1 MiB/s rd, 2.7 MiB/s wr, 151 op/s
Dec 06 10:05:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:47 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22180046f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:47 compute-1 ceph-mon[79770]: pgmap v736: 337 pgs: 337 active+clean; 88 MiB data, 220 MiB used, 60 GiB / 60 GiB avail; 4.2 MiB/s rd, 2.2 MiB/s wr, 124 op/s
Dec 06 10:05:47 compute-1 ceph-mon[79770]: from='client.? 192.168.122.10:0/850046515' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 06 10:05:47 compute-1 ceph-mon[79770]: from='client.? 192.168.122.10:0/850046515' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 06 10:05:48 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:05:48 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:05:48 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:48.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:05:48 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:48 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c004520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:48 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:05:48 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:05:48 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:48.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:05:48 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:48 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228001fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:48 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:05:48 compute-1 ceph-mon[79770]: pgmap v737: 337 pgs: 337 active+clean; 88 MiB data, 220 MiB used, 60 GiB / 60 GiB avail; 2.3 MiB/s rd, 15 KiB/s wr, 88 op/s
Dec 06 10:05:49 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:49 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228001fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:50 compute-1 sudo[231494]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:05:50 compute-1 sudo[231494]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:05:50 compute-1 sudo[231494]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:50 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:05:50 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:05:50 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:50.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:05:50 compute-1 sudo[231519]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 06 10:05:50 compute-1 sudo[231519]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:05:50 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:50 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22180046f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:50 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:05:50 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:05:50 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:50.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:05:50 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:50 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c004520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:50 compute-1 sudo[231519]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:51 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:51 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c004520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:51 compute-1 sudo[231576]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:05:51 compute-1 sudo[231576]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:05:51 compute-1 sudo[231576]: pam_unix(sudo:session): session closed for user root
Dec 06 10:05:51 compute-1 ceph-mon[79770]: pgmap v738: 337 pgs: 337 active+clean; 88 MiB data, 220 MiB used, 60 GiB / 60 GiB avail; 2.3 MiB/s rd, 15 KiB/s wr, 88 op/s
Dec 06 10:05:52 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:05:52 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:05:52 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:52.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:05:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:52 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:52 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:05:52 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:05:52 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:52.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:05:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:52 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22180046f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/100552 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec 06 10:05:52 compute-1 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #46. Immutable memtables: 0.
Dec 06 10:05:52 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:05:52.795525) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 06 10:05:52 compute-1 ceph-mon[79770]: rocksdb: [db/flush_job.cc:856] [default] [JOB 25] Flushing memtable with next log file: 46
Dec 06 10:05:52 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015552795571, "job": 25, "event": "flush_started", "num_memtables": 1, "num_entries": 1149, "num_deletes": 256, "total_data_size": 2543773, "memory_usage": 2579736, "flush_reason": "Manual Compaction"}
Dec 06 10:05:52 compute-1 ceph-mon[79770]: rocksdb: [db/flush_job.cc:885] [default] [JOB 25] Level-0 flush table #47: started
Dec 06 10:05:52 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015552807890, "cf_name": "default", "job": 25, "event": "table_file_creation", "file_number": 47, "file_size": 1680401, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 23620, "largest_seqno": 24764, "table_properties": {"data_size": 1675158, "index_size": 2703, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1477, "raw_key_size": 11296, "raw_average_key_size": 19, "raw_value_size": 1664427, "raw_average_value_size": 2864, "num_data_blocks": 118, "num_entries": 581, "num_filter_entries": 581, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015477, "oldest_key_time": 1765015477, "file_creation_time": 1765015552, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 47, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:05:52 compute-1 ceph-mon[79770]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 25] Flush lasted 12428 microseconds, and 6182 cpu microseconds.
Dec 06 10:05:52 compute-1 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:05:52 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:05:52.807953) [db/flush_job.cc:967] [default] [JOB 25] Level-0 flush table #47: 1680401 bytes OK
Dec 06 10:05:52 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:05:52.807980) [db/memtable_list.cc:519] [default] Level-0 commit table #47 started
Dec 06 10:05:52 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:05:52.809488) [db/memtable_list.cc:722] [default] Level-0 commit table #47: memtable #1 done
Dec 06 10:05:52 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:05:52.809501) EVENT_LOG_v1 {"time_micros": 1765015552809497, "job": 25, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 06 10:05:52 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:05:52.809522) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 06 10:05:52 compute-1 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 25] Try to delete WAL files size 2538114, prev total WAL file size 2538114, number of live WAL files 2.
Dec 06 10:05:52 compute-1 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000043.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:05:52 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:05:52.810335) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D00323532' seq:72057594037927935, type:22 .. '6C6F676D00353034' seq:0, type:0; will stop at (end)
Dec 06 10:05:52 compute-1 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 26] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 06 10:05:52 compute-1 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 25 Base level 0, inputs: [47(1641KB)], [45(11MB)]
Dec 06 10:05:52 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015552810427, "job": 26, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [47], "files_L6": [45], "score": -1, "input_data_size": 13896853, "oldest_snapshot_seqno": -1}
Dec 06 10:05:52 compute-1 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 26] Generated table #48: 5365 keys, 13714802 bytes, temperature: kUnknown
Dec 06 10:05:52 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015552872926, "cf_name": "default", "job": 26, "event": "table_file_creation", "file_number": 48, "file_size": 13714802, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13678587, "index_size": 21705, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13445, "raw_key_size": 137354, "raw_average_key_size": 25, "raw_value_size": 13580964, "raw_average_value_size": 2531, "num_data_blocks": 884, "num_entries": 5365, "num_filter_entries": 5365, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014012, "oldest_key_time": 0, "file_creation_time": 1765015552, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 48, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:05:52 compute-1 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:05:52 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:05:52.873279) [db/compaction/compaction_job.cc:1663] [default] [JOB 26] Compacted 1@0 + 1@6 files to L6 => 13714802 bytes
Dec 06 10:05:52 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:05:52.874844) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 222.1 rd, 219.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.6, 11.7 +0.0 blob) out(13.1 +0.0 blob), read-write-amplify(16.4) write-amplify(8.2) OK, records in: 5899, records dropped: 534 output_compression: NoCompression
Dec 06 10:05:52 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:05:52.874863) EVENT_LOG_v1 {"time_micros": 1765015552874855, "job": 26, "event": "compaction_finished", "compaction_time_micros": 62581, "compaction_time_cpu_micros": 26986, "output_level": 6, "num_output_files": 1, "total_output_size": 13714802, "num_input_records": 5899, "num_output_records": 5365, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 06 10:05:52 compute-1 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000047.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:05:52 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015552875318, "job": 26, "event": "table_file_deletion", "file_number": 47}
Dec 06 10:05:52 compute-1 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000045.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:05:52 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015552877651, "job": 26, "event": "table_file_deletion", "file_number": 45}
Dec 06 10:05:52 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:05:52.810273) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:05:52 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:05:52.877747) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:05:52 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:05:52.877754) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:05:52 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:05:52.877756) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:05:52 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:05:52.877758) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:05:52 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:05:52.877760) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:05:53 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:53 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228001fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:53 compute-1 ceph-mon[79770]: pgmap v739: 337 pgs: 337 active+clean; 88 MiB data, 220 MiB used, 60 GiB / 60 GiB avail; 2.3 MiB/s rd, 15 KiB/s wr, 88 op/s
Dec 06 10:05:53 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:05:54 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:05:54 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:05:54 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:54.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:05:54 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c004520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:54 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:05:54 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:05:54 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:54.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:05:54 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:05:54.279 141446 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:05:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:05:54.280 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:05:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:05:54.280 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:05:54 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:05:54 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:05:54 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:05:54 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 10:05:54 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 06 10:05:54 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:05:54 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:05:54 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 06 10:05:54 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 06 10:05:54 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 10:05:54 compute-1 ceph-osd[77465]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0.
Dec 06 10:05:55 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:55 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:55 compute-1 ceph-mon[79770]: pgmap v740: 337 pgs: 337 active+clean; 88 MiB data, 220 MiB used, 60 GiB / 60 GiB avail; 2.0 MiB/s rd, 13 KiB/s wr, 77 op/s
Dec 06 10:05:55 compute-1 ceph-mon[79770]: pgmap v741: 337 pgs: 337 active+clean; 88 MiB data, 220 MiB used, 60 GiB / 60 GiB avail; 266 KiB/s rd, 9 op/s
Dec 06 10:05:55 compute-1 ceph-mon[79770]: Health check failed: 1 failed cephadm daemon(s) (CEPHADM_FAILED_DAEMON)
Dec 06 10:05:56 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:05:56 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:05:56 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:56.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:05:56 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:56 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228001fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:56 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:05:56 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:05:56 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:56.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:05:56 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:56 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c004520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:57 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:57 compute-1 ceph-mon[79770]: pgmap v742: 337 pgs: 337 active+clean; 88 MiB data, 220 MiB used, 60 GiB / 60 GiB avail; 266 KiB/s rd, 9 op/s
Dec 06 10:05:58 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:05:58 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:05:58 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:05:58.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:05:58 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:58 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22180046f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:58 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:05:58 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:05:58 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:05:58.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:05:58 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:58 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22180046f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:58 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:05:59 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:05:59 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c004520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:05:59 compute-1 ceph-mon[79770]: pgmap v743: 337 pgs: 337 active+clean; 121 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 378 KiB/s rd, 2.5 MiB/s wr, 74 op/s
Dec 06 10:06:00 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:06:00 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:06:00 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:00.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:06:00 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:00 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:00 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:06:00 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:00 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:00 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:06:00 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:00.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:06:00 compute-1 sudo[231606]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:06:00 compute-1 sudo[231606]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:06:00 compute-1 sudo[231606]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:00 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:00 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:06:01 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:01 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228001fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:01 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:06:01 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:06:01 compute-1 ceph-mon[79770]: pgmap v744: 337 pgs: 337 active+clean; 121 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 378 KiB/s rd, 2.5 MiB/s wr, 74 op/s
Dec 06 10:06:02 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:06:02 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:06:02 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:02.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:06:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:02 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c004520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:02 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228001fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:02 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:06:02 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:06:02 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:02.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:06:03 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:03 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22180046f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:03 compute-1 ceph-mon[79770]: pgmap v745: 337 pgs: 337 active+clean; 121 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 378 KiB/s rd, 2.5 MiB/s wr, 74 op/s
Dec 06 10:06:03 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:06:03 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:03 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:06:03 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:03 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:06:04 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:06:04 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:06:04 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:04.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:06:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:04 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:04 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228004890 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:04 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:06:04 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:06:04 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:04.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:06:05 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:05 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c004520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:05 compute-1 ceph-mon[79770]: pgmap v746: 337 pgs: 337 active+clean; 121 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 382 KiB/s rd, 2.5 MiB/s wr, 75 op/s
Dec 06 10:06:06 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:06:06 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:06:06 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:06.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:06:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:06 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22180046f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:06 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:06 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:06:06 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:06:06 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:06.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:06:06 compute-1 podman[231634]: 2025-12-06 10:06:06.788121574 +0000 UTC m=+0.085263728 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:06:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:06 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec 06 10:06:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:07 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228004890 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:07 compute-1 ceph-mon[79770]: pgmap v747: 337 pgs: 337 active+clean; 121 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 329 KiB/s rd, 2.1 MiB/s wr, 64 op/s
Dec 06 10:06:08 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:06:08.105 141446 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '9a:dc:0d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'b6:0a:c4:b8:be:39'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:06:08 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:06:08.107 141446 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 06 10:06:08 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:06:08.108 141446 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=61eba479-a995-4b31-88b9-8ebfcea9907e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 10:06:08 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:06:08 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:06:08 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:08.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:06:08 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:08 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c004520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:08 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:08 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c004520 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:08 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:06:08 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:06:08 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:08.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:06:08 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:06:09 compute-1 ceph-mon[79770]: pgmap v748: 337 pgs: 337 active+clean; 121 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 330 KiB/s rd, 2.1 MiB/s wr, 66 op/s
Dec 06 10:06:09 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:06:09 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:06:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:09 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:10 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:06:10 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:06:10 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:10.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:06:10 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:10 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228004890 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:10 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:10 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224002550 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:10 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:06:10 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:06:10 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:10.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:06:10 compute-1 podman[231665]: 2025-12-06 10:06:10.737117247 +0000 UTC m=+0.047280152 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:06:11 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:11 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:11 compute-1 sudo[231685]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:06:11 compute-1 sudo[231685]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:06:11 compute-1 sudo[231685]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:11 compute-1 ceph-mon[79770]: pgmap v749: 337 pgs: 337 active+clean; 121 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 7.4 KiB/s rd, 17 KiB/s wr, 3 op/s
Dec 06 10:06:12 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:06:12 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:06:12 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:12.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:06:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:12 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:12 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228004890 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/100612 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec 06 10:06:12 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:06:12 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:06:12 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:12.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:06:13 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:13 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224002550 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:13 compute-1 ceph-mon[79770]: pgmap v750: 337 pgs: 337 active+clean; 121 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 7.4 KiB/s rd, 17 KiB/s wr, 3 op/s
Dec 06 10:06:13 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/4031308316' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:06:13 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:06:14 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:06:14 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:06:14 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:14.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:06:14 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:14 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:14 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:14 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:14 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:06:14 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:06:14 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:14.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:06:14 compute-1 podman[231712]: 2025-12-06 10:06:14.779113722 +0000 UTC m=+0.076885866 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible)
Dec 06 10:06:15 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:15 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228004890 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:15 compute-1 ceph-mon[79770]: pgmap v751: 337 pgs: 337 active+clean; 121 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 7.5 KiB/s rd, 18 KiB/s wr, 4 op/s
Dec 06 10:06:16 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:06:16 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:06:16 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:16.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:06:16 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:16 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224002550 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:16 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:16 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:16 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:06:16 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:06:16 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:16.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:06:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:17 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:17 compute-1 ceph-mon[79770]: pgmap v752: 337 pgs: 337 active+clean; 121 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 4.7 KiB/s wr, 2 op/s
Dec 06 10:06:18 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:06:18 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:06:18 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:18.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:06:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:18 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228004890 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:18 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224002550 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:18 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:06:18 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:06:18 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:18.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:06:18 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:06:19 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:19 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:19 compute-1 ceph-mon[79770]: pgmap v753: 337 pgs: 337 active+clean; 167 MiB data, 319 MiB used, 60 GiB / 60 GiB avail; 1.7 MiB/s rd, 1.8 MiB/s wr, 36 op/s
Dec 06 10:06:20 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:06:20 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:06:20 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:20.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:06:20 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:20 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:20 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:20 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228004890 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:20 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:06:20 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:06:20 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:20.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:06:20 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/1556517146' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec 06 10:06:21 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:21 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:21 compute-1 ceph-mon[79770]: pgmap v754: 337 pgs: 337 active+clean; 167 MiB data, 319 MiB used, 60 GiB / 60 GiB avail; 1.7 MiB/s rd, 1.8 MiB/s wr, 34 op/s
Dec 06 10:06:21 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/3225489691' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec 06 10:06:22 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:06:22 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:06:22 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:22.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:06:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:22 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224003a20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:22 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:22 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:06:22 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:06:22 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:22.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:06:22 compute-1 ceph-mon[79770]: pgmap v755: 337 pgs: 337 active+clean; 167 MiB data, 319 MiB used, 60 GiB / 60 GiB avail; 1.7 MiB/s rd, 1.8 MiB/s wr, 34 op/s
Dec 06 10:06:23 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:23 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228004890 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:23 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:06:23 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:06:24 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:06:24 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:06:24 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:24.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:06:24 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:24 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:24 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:24 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224003a20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:24 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:06:24 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:06:24 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:24.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:06:24 compute-1 ceph-mon[79770]: pgmap v756: 337 pgs: 337 active+clean; 167 MiB data, 319 MiB used, 60 GiB / 60 GiB avail; 1.7 MiB/s rd, 1.8 MiB/s wr, 34 op/s
Dec 06 10:06:25 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:25 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:26 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:06:26 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:06:26 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:26.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:06:26 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:26 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:26 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:26 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228004890 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:26 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:06:26 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:06:26 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:26.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:06:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:27 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22180046f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:27 compute-1 ceph-mon[79770]: pgmap v757: 337 pgs: 337 active+clean; 167 MiB data, 319 MiB used, 60 GiB / 60 GiB avail; 1.7 MiB/s rd, 1.8 MiB/s wr, 34 op/s
Dec 06 10:06:28 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:06:28 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:06:28 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:28.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:06:28 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:28 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:28 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:28 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:28 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:06:28 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:06:28 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:28.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:06:28 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:06:28 compute-1 ceph-mon[79770]: pgmap v758: 337 pgs: 337 active+clean; 167 MiB data, 320 MiB used, 60 GiB / 60 GiB avail; 3.6 MiB/s rd, 1.8 MiB/s wr, 108 op/s
Dec 06 10:06:29 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:29 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228004890 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:30 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:06:30 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:06:30 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:30.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:06:30 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:30 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22180046f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:30 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:30 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:30 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:06:30 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:06:30 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:30.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:06:31 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:31 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:31 compute-1 sudo[231740]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:06:31 compute-1 sudo[231740]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:06:31 compute-1 sudo[231740]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:31 compute-1 ceph-mon[79770]: pgmap v759: 337 pgs: 337 active+clean; 167 MiB data, 320 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 15 KiB/s wr, 74 op/s
Dec 06 10:06:32 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:06:32 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:06:32 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:32.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:06:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:32 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228004890 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:32 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22180046f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:32 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:06:32 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:06:32 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:32.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:06:33 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:33 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:33 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:06:33 compute-1 ceph-mon[79770]: pgmap v760: 337 pgs: 337 active+clean; 167 MiB data, 320 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 15 KiB/s wr, 74 op/s
Dec 06 10:06:34 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:06:34 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:06:34 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:34.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:06:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:34 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:34 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228004890 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:34 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:06:34 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:06:34 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:34.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:06:35 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:35 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22180046f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:36 compute-1 ceph-mon[79770]: pgmap v761: 337 pgs: 337 active+clean; 167 MiB data, 320 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 15 KiB/s wr, 74 op/s
Dec 06 10:06:36 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:06:36 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:06:36 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:36.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:06:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:36 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:36 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:36 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:06:36 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:06:36 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:36.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:06:36 compute-1 nova_compute[228576]: 2025-12-06 10:06:36.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:06:36 compute-1 nova_compute[228576]: 2025-12-06 10:06:36.471 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:06:37 compute-1 ceph-mon[79770]: pgmap v762: 337 pgs: 337 active+clean; 167 MiB data, 320 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 15 KiB/s wr, 74 op/s
Dec 06 10:06:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:37 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228004890 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:37 compute-1 podman[231768]: 2025-12-06 10:06:37.779296653 +0000 UTC m=+0.088654289 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:06:38 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:06:38 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:06:38 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:38.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:06:38 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:38 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22180046f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:38 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:38 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:38 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:06:38 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:06:38 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:38.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:06:38 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:06:39 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:39 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400053e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:39 compute-1 nova_compute[228576]: 2025-12-06 10:06:39.469 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:06:39 compute-1 nova_compute[228576]: 2025-12-06 10:06:39.470 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:06:39 compute-1 nova_compute[228576]: 2025-12-06 10:06:39.470 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:06:39 compute-1 nova_compute[228576]: 2025-12-06 10:06:39.510 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 06 10:06:39 compute-1 nova_compute[228576]: 2025-12-06 10:06:39.511 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:06:39 compute-1 nova_compute[228576]: 2025-12-06 10:06:39.511 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:06:39 compute-1 nova_compute[228576]: 2025-12-06 10:06:39.539 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:06:39 compute-1 nova_compute[228576]: 2025-12-06 10:06:39.541 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:06:39 compute-1 nova_compute[228576]: 2025-12-06 10:06:39.542 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:06:39 compute-1 nova_compute[228576]: 2025-12-06 10:06:39.543 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:06:39 compute-1 nova_compute[228576]: 2025-12-06 10:06:39.545 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:06:39 compute-1 ceph-mon[79770]: pgmap v763: 337 pgs: 337 active+clean; 195 MiB data, 344 MiB used, 60 GiB / 60 GiB avail; 2.2 MiB/s rd, 2.1 MiB/s wr, 128 op/s
Dec 06 10:06:39 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:06:39 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:06:39 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/172757125' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:06:39 compute-1 nova_compute[228576]: 2025-12-06 10:06:39.984 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:06:40 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:06:40 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:06:40 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:40.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:06:40 compute-1 nova_compute[228576]: 2025-12-06 10:06:40.195 228580 WARNING nova.virt.libvirt.driver [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:06:40 compute-1 nova_compute[228576]: 2025-12-06 10:06:40.197 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5203MB free_disk=59.897621154785156GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:06:40 compute-1 nova_compute[228576]: 2025-12-06 10:06:40.197 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:06:40 compute-1 nova_compute[228576]: 2025-12-06 10:06:40.197 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:06:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:40 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228004890 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:40 compute-1 nova_compute[228576]: 2025-12-06 10:06:40.270 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:06:40 compute-1 nova_compute[228576]: 2025-12-06 10:06:40.271 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:06:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:40 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22180046f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/100640 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec 06 10:06:40 compute-1 nova_compute[228576]: 2025-12-06 10:06:40.297 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:06:40 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:06:40 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:06:40 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:40.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:06:40 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:06:40 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1754887358' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:06:40 compute-1 nova_compute[228576]: 2025-12-06 10:06:40.755 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:06:40 compute-1 nova_compute[228576]: 2025-12-06 10:06:40.760 228580 DEBUG nova.compute.provider_tree [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed in ProviderTree for provider: ff2f17cb-ff1d-4da7-9560-4be741380cb1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:06:40 compute-1 nova_compute[228576]: 2025-12-06 10:06:40.773 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed for provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:06:40 compute-1 nova_compute[228576]: 2025-12-06 10:06:40.775 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:06:40 compute-1 nova_compute[228576]: 2025-12-06 10:06:40.775 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.578s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:06:40 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/172757125' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:06:40 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/1103758457' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:06:40 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/1754887358' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:06:41 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:41 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c0045f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:41 compute-1 nova_compute[228576]: 2025-12-06 10:06:41.734 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:06:41 compute-1 nova_compute[228576]: 2025-12-06 10:06:41.753 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:06:41 compute-1 nova_compute[228576]: 2025-12-06 10:06:41.753 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:06:41 compute-1 nova_compute[228576]: 2025-12-06 10:06:41.754 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:06:41 compute-1 podman[231841]: 2025-12-06 10:06:41.755398169 +0000 UTC m=+0.056313200 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Dec 06 10:06:41 compute-1 ceph-mon[79770]: pgmap v764: 337 pgs: 337 active+clean; 195 MiB data, 344 MiB used, 60 GiB / 60 GiB avail; 330 KiB/s rd, 2.1 MiB/s wr, 53 op/s
Dec 06 10:06:41 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/1975280571' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:06:42 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:06:42 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:06:42 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:42.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:06:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:42 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400053e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:42 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:42 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:06:42 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:06:42 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:42.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:06:42 compute-1 nova_compute[228576]: 2025-12-06 10:06:42.483 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:06:43 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:43 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22180046f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:43 compute-1 nova_compute[228576]: 2025-12-06 10:06:43.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:06:43 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:06:43 compute-1 ceph-mon[79770]: pgmap v765: 337 pgs: 337 active+clean; 195 MiB data, 344 MiB used, 60 GiB / 60 GiB avail; 330 KiB/s rd, 2.1 MiB/s wr, 53 op/s
Dec 06 10:06:44 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:06:44 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:06:44 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:44.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:06:44 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:44 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c0045f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:44 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:44 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400053e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:44 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:06:44 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:06:44 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:44.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:06:44 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/3034468691' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:06:45 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:45 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c003760 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:45 compute-1 podman[231862]: 2025-12-06 10:06:45.763115678 +0000 UTC m=+0.068081453 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 06 10:06:46 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:06:46 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:06:46 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:46.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:06:46 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:46 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22180046f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:46 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:46 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c0045f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:46 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:06:46 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:06:46 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:46.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:06:47 compute-1 ceph-mon[79770]: pgmap v766: 337 pgs: 337 active+clean; 200 MiB data, 345 MiB used, 60 GiB / 60 GiB avail; 391 KiB/s rd, 2.1 MiB/s wr, 67 op/s
Dec 06 10:06:47 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/490054839' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:06:47 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/2091989865' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:06:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:47 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400053e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:48 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:06:48 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:06:48 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:48.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:06:48 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:48 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c003760 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:48 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:48 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22180046f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:48 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:06:48 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:06:48 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:48.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:06:48 compute-1 ceph-mon[79770]: pgmap v767: 337 pgs: 337 active+clean; 200 MiB data, 345 MiB used, 60 GiB / 60 GiB avail; 390 KiB/s rd, 2.1 MiB/s wr, 67 op/s
Dec 06 10:06:48 compute-1 ceph-mon[79770]: from='client.? 192.168.122.10:0/3518606672' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 06 10:06:48 compute-1 ceph-mon[79770]: from='client.? 192.168.122.10:0/3518606672' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 06 10:06:48 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:06:49 compute-1 ceph-mon[79770]: pgmap v768: 337 pgs: 337 active+clean; 178 MiB data, 345 MiB used, 60 GiB / 60 GiB avail; 399 KiB/s rd, 2.1 MiB/s wr, 79 op/s
Dec 06 10:06:49 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:49 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c0045f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:50 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:06:50 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:06:50 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:50.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:06:50 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:50 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400053e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:50 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:50 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c003760 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:50 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:06:50 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:06:50 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:50.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:06:50 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/3586066368' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:06:50 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:50 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:06:51 compute-1 ceph-mon[79770]: pgmap v769: 337 pgs: 337 active+clean; 178 MiB data, 345 MiB used, 60 GiB / 60 GiB avail; 70 KiB/s rd, 70 KiB/s wr, 25 op/s
Dec 06 10:06:51 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:51 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c0045f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:51 compute-1 sudo[231885]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:06:51 compute-1 sudo[231885]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:06:51 compute-1 sudo[231885]: pam_unix(sudo:session): session closed for user root
Dec 06 10:06:52 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:06:52 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:06:52 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:52.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:06:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:52 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22180046f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:52 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22180046f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:52 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:06:52 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:06:52 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:52.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:06:53 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:53 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c003760 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:53 compute-1 ceph-mon[79770]: pgmap v770: 337 pgs: 337 active+clean; 178 MiB data, 345 MiB used, 60 GiB / 60 GiB avail; 70 KiB/s rd, 70 KiB/s wr, 25 op/s
Dec 06 10:06:53 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:06:53 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:53 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:06:53 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:53 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:06:54 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:06:54 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:06:54 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:54.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:06:54 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c0045f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:06:54.280 141446 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:06:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:06:54.281 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:06:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:06:54.281 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:06:54 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c0045f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:54 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:06:54 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:06:54 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:54.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:06:54 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:06:55 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/100655 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec 06 10:06:55 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:55 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218004710 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:55 compute-1 ceph-mon[79770]: pgmap v771: 337 pgs: 337 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 81 KiB/s rd, 71 KiB/s wr, 43 op/s
Dec 06 10:06:55 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/604483348' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:06:56 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:06:56 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:06:56 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:56.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:06:56 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:56 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c003760 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:56 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:56 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400053e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:56 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:06:56 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:06:56 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:56.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:06:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:57 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c0045f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:58 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:06:58 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:06:58 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:06:58.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:06:58 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:58 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218004730 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:58 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:58 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224002e50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:58 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:06:58 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:06:58 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:06:58.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:06:58 compute-1 ceph-mon[79770]: pgmap v772: 337 pgs: 337 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 15 KiB/s wr, 29 op/s
Dec 06 10:06:58 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:06:59 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:06:59 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400053e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:06:59 compute-1 ceph-mon[79770]: pgmap v773: 337 pgs: 337 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 39 KiB/s rd, 16 KiB/s wr, 58 op/s
Dec 06 10:07:00 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:07:00 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:07:00 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:00.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:07:00 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:00 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400053e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:00 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:00 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218004750 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:00 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:07:00 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:07:00 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:00.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:07:00 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:00 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:07:00 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:00 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:07:00 compute-1 sudo[231916]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:07:00 compute-1 sudo[231916]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:07:00 compute-1 sudo[231916]: pam_unix(sudo:session): session closed for user root
Dec 06 10:07:00 compute-1 sudo[231941]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 06 10:07:00 compute-1 sudo[231941]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:07:01 compute-1 sudo[231941]: pam_unix(sudo:session): session closed for user root
Dec 06 10:07:01 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:01 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224002e50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:01 compute-1 ceph-mon[79770]: pgmap v774: 337 pgs: 337 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 31 KiB/s rd, 2.4 KiB/s wr, 45 op/s
Dec 06 10:07:01 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 10:07:01 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 06 10:07:01 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:07:01 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:07:01 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 06 10:07:01 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 06 10:07:01 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 10:07:02 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:07:02 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:07:02 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:02.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:07:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:02 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400053e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:02 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400053e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:02 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:07:02 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:07:02 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:02.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:07:02 compute-1 ceph-mon[79770]: pgmap v775: 337 pgs: 337 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 34 KiB/s rd, 2.7 KiB/s wr, 51 op/s
Dec 06 10:07:03 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:03 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218004770 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:03 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:03 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec 06 10:07:03 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:03 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:07:03 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:07:04 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:07:04 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:07:04 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:04.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:07:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:04 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224002e50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:04 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c0045f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:04 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:07:04 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:07:04 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:04.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:07:04 compute-1 ceph-mon[79770]: pgmap v776: 337 pgs: 337 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 35 KiB/s rd, 3.1 KiB/s wr, 52 op/s
Dec 06 10:07:05 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:05 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240005400 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:06 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:07:06 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:07:06 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:06.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:07:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:06 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218004790 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:06 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c0045f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:06 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:07:06 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:07:06 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:06.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:07:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:06 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:07:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:06 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:07:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:06 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:07:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:06 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
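
The reaper thread above walks a full NFS-Ganesha grace cycle: lift grace, re-enter it for 90 seconds, reload client recovery info from the backend, then poll until reclaim is complete and the reclaimable-client count hits zero. With "clid count(0)" there is nothing to wait for, which is presumably why grace is lifted again only a few seconds later below. A sketch that tracks these transitions; the message shapes are taken from this log only:

import re

# Hedged sketch: follow the grace-period state machine visible in the
# ganesha reaper messages above.
ENTER = re.compile(r'NFS Server Now IN GRACE, duration (\d+)')
LIFT  = re.compile(r'NFS Server Now NOT IN GRACE')
CHECK = re.compile(r'reclaim complete\((\d+)\) clid count\((\d+)\)')

def grace_events(lines):
    for line in lines:
        if (m := ENTER.search(line)):
            yield ('enter', int(m.group(1)))        # grace duration in seconds
        elif LIFT.search(line):
            yield ('lift', None)
        elif (m := CHECK.search(line)):
            yield ('check', (int(m.group(1)), int(m.group(2))))

demo = 'nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90'
print(list(grace_events([demo])))  # [('enter', 90)]
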
Dec 06 10:07:06 compute-1 sudo[232001]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:07:06 compute-1 sudo[232001]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:07:06 compute-1 sudo[232001]: pam_unix(sudo:session): session closed for user root
Dec 06 10:07:06 compute-1 ceph-mon[79770]: pgmap v777: 337 pgs: 337 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 23 KiB/s rd, 1.7 KiB/s wr, 33 op/s
Dec 06 10:07:06 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:07:06 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:07:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:07 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224002e50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:08 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:07:08 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:07:08 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:08.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:07:08 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:08 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240005420 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:08 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:08 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22180047b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:08 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:07:08 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:07:08 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:08.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:07:08 compute-1 podman[232027]: 2025-12-06 10:07:08.777964173 +0000 UTC m=+0.082493851 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0)
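
podman emits one of these "container health_status" events each time it runs the configured healthcheck (here the '/openstack/healthcheck' script mounted into the container per config_data) and records the verdict plus the failing streak. A sketch that pulls out the fields worth watching; key names and their order are assumed from these events only:

import re

# Hedged sketch: extract container name and health verdict from podman
# health_status events like the one above.
HEALTH_RE = re.compile(
    r'container health_status (?P<cid>[0-9a-f]{12,64}) .*?'
    r'name=(?P<name>[^,)]+).*?health_status=(?P<status>[^,)]+)'
)

def podman_health(line):
    m = HEALTH_RE.search(line)
    return (m['name'], m['status'], m['cid'][:12]) if m else None

demo = ('container health_status 00ce4a06fb8121e4e606 '
        '(image=x, name=ovn_controller, health_status=healthy)')
print(podman_health(demo))  # ('ovn_controller', 'healthy', '00ce4a06fb81')
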
Dec 06 10:07:08 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:07:08 compute-1 ceph-mon[79770]: pgmap v778: 337 pgs: 337 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 23 KiB/s rd, 1.7 KiB/s wr, 33 op/s
Dec 06 10:07:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:09 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c0045f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:09 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec 06 10:07:09 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
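
The mgr on compute-0 polls the OSD blocklist about every 15 seconds, which is what these audited mon dispatches are (they recur at 10:07:09, 10:07:24, 10:07:39 below). The same query can be issued by hand; a sketch assuming a local admin ceph client — "osd blocklist ls" is the current spelling of the older "osd blacklist ls":

import subprocess

# Hedged sketch: issue the same query the mgr dispatches above, via the
# ceph CLI. Assumes an admin keyring is available on this host.
out = subprocess.run(
    ['ceph', 'osd', 'blocklist', 'ls', '--format', 'json'],
    capture_output=True, text=True, check=True,
)
print(out.stdout)  # JSON list of blocklisted client addrs (schema not shown here)
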
Dec 06 10:07:10 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:07:10 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:07:10 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:10.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:07:10 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:10 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224002e50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:10 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:10 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240005440 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:10 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/100710 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
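
haproxy marking backend/nfs.cephfs.2 UP after a "Layer4 check" means nothing more than a TCP connect that succeeded within the timeout; no NFS RPC is exchanged. That also plausibly accounts for the steady ganesha svc_vc_recv EVENTs on fd 48 throughout this capture: probe connections that open and close without delivering a valid request. A minimal sketch of such a check; host and port are placeholders, not values from this log:

import socket

# Hedged sketch of a Layer4 (TCP connect) health check like the one
# haproxy reports above. 192.0.2.10 / 2049 are illustrative only.
def layer4_up(host: str, port: int, timeout: float = 2.0) -> bool:
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True          # connect succeeded -> backend counts as UP
    except OSError:
        return False             # refused / timed out -> backend counts as DOWN

print(layer4_up('192.0.2.10', 2049))
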
Dec 06 10:07:10 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:07:10 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:07:10 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:10.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:07:10 compute-1 ceph-mon[79770]: pgmap v779: 337 pgs: 337 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 3.2 KiB/s rd, 1.3 KiB/s wr, 4 op/s
Dec 06 10:07:11 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:11 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22180047d0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:11 compute-1 sudo[232055]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:07:11 compute-1 sudo[232055]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:07:11 compute-1 sudo[232055]: pam_unix(sudo:session): session closed for user root
Dec 06 10:07:11 compute-1 ceph-mon[79770]: pgmap v780: 337 pgs: 337 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 3.2 KiB/s rd, 1.3 KiB/s wr, 4 op/s
Dec 06 10:07:12 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:07:12 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:07:12 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:12.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:07:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:12 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c0045f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:12 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224002e50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:12 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:07:12 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:07:12 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:12.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:07:12 compute-1 podman[232081]: 2025-12-06 10:07:12.769403058 +0000 UTC m=+0.065640004 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:07:12 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:07:12.941 141446 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '9a:dc:0d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'b6:0a:c4:b8:be:39'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:07:12 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:07:12.942 141446 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 06 10:07:12 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:07:12.943 141446 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=61eba479-a995-4b31-88b9-8ebfcea9907e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
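
The DEBUG triple above is the OVN metadata agent's liveness handshake: northd bumped SB_Global.nb_cfg from 4 to 5, and the agent acknowledges by copying the new value into its Chassis_Private external_ids under 'neutron:ovn-metadata-sb-cfg' via the DbSetCommand transaction shown, which is how neutron-server can tell the agent is alive and caught up. A plain-Python model of that ack (not the neutron/ovsdbapp code); the record shapes mirror the log:

# Hedged model: dicts stand in for the OVN southbound rows named above.
sb_global = {'nb_cfg': 5}
chassis_private = {'external_ids': {'neutron:ovn-metadata-sb-cfg': '4'}}

def ack_nb_cfg(sb, chassis):
    key = 'neutron:ovn-metadata-sb-cfg'
    seen = int(chassis['external_ids'].get(key, -1))
    if sb['nb_cfg'] > seen:                      # new config generation observed
        chassis['external_ids'][key] = str(sb['nb_cfg'])

ack_nb_cfg(sb_global, chassis_private)
print(chassis_private['external_ids'])  # {'neutron:ovn-metadata-sb-cfg': '5'}
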
Dec 06 10:07:13 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:13 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240005460 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:13 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:07:14 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:07:14 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:07:14 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:14.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:07:14 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:14 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22180047f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:14 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:14 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228002180 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:14 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:07:14 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:07:14 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:14.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:07:14 compute-1 ceph-mon[79770]: pgmap v781: 337 pgs: 337 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 3.4 KiB/s rd, 1.3 KiB/s wr, 5 op/s
Dec 06 10:07:15 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/100715 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec 06 10:07:15 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:15 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224002e50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:16 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:07:16 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:07:16 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:16.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:07:16 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:16 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240005480 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:16 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:16 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22180047f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:16 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:07:16 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:07:16 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:16.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:07:16 compute-1 ceph-mon[79770]: pgmap v782: 337 pgs: 337 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 2.5 KiB/s rd, 1023 B/s wr, 3 op/s
Dec 06 10:07:16 compute-1 podman[232104]: 2025-12-06 10:07:16.750095996 +0000 UTC m=+0.059443734 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.license=GPLv2, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 06 10:07:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:17 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228002180 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:18 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:07:18 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:07:18 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:18.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:07:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:18 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224002e50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:18 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400054a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:18 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:07:18 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:07:18 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:18.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:07:18 compute-1 ceph-mon[79770]: pgmap v783: 337 pgs: 337 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 2.5 KiB/s rd, 1023 B/s wr, 3 op/s
Dec 06 10:07:18 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:07:19 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:19 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218004810 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:19 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/1919256904' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:07:20 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:07:20 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:07:20 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:20.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:07:20 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:20 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228002180 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:20 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:20 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224002e50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:20 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:07:20 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:07:20 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:20.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:07:20 compute-1 ceph-mon[79770]: pgmap v784: 337 pgs: 337 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 2.5 KiB/s rd, 1023 B/s wr, 3 op/s
Dec 06 10:07:21 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:21 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400054c0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:22 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:07:22 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:07:22 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:22.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:07:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:22 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218004830 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:22 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228002320 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:22 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:07:22 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:07:22 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:22.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:07:22 compute-1 ceph-mon[79770]: pgmap v785: 337 pgs: 337 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 170 B/s wr, 0 op/s
Dec 06 10:07:23 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:23 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224002e50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:23 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:07:23 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/4093221170' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec 06 10:07:23 compute-1 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #49. Immutable memtables: 0.
Dec 06 10:07:23 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:07:23.912094) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 06 10:07:23 compute-1 ceph-mon[79770]: rocksdb: [db/flush_job.cc:856] [default] [JOB 27] Flushing memtable with next log file: 49
Dec 06 10:07:23 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015643912321, "job": 27, "event": "flush_started", "num_memtables": 1, "num_entries": 1207, "num_deletes": 251, "total_data_size": 2821937, "memory_usage": 2847248, "flush_reason": "Manual Compaction"}
Dec 06 10:07:23 compute-1 ceph-mon[79770]: rocksdb: [db/flush_job.cc:885] [default] [JOB 27] Level-0 flush table #50: started
Dec 06 10:07:23 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015643925331, "cf_name": "default", "job": 27, "event": "table_file_creation", "file_number": 50, "file_size": 1820143, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 24769, "largest_seqno": 25971, "table_properties": {"data_size": 1815003, "index_size": 2600, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1477, "raw_key_size": 11653, "raw_average_key_size": 19, "raw_value_size": 1804422, "raw_average_value_size": 3084, "num_data_blocks": 116, "num_entries": 585, "num_filter_entries": 585, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015553, "oldest_key_time": 1765015553, "file_creation_time": 1765015643, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 50, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:07:23 compute-1 ceph-mon[79770]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 27] Flush lasted 13293 microseconds, and 5530 cpu microseconds.
Dec 06 10:07:23 compute-1 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:07:23 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:07:23.925405) [db/flush_job.cc:967] [default] [JOB 27] Level-0 flush table #50: 1820143 bytes OK
Dec 06 10:07:23 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:07:23.925438) [db/memtable_list.cc:519] [default] Level-0 commit table #50 started
Dec 06 10:07:23 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:07:23.927446) [db/memtable_list.cc:722] [default] Level-0 commit table #50: memtable #1 done
Dec 06 10:07:23 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:07:23.927487) EVENT_LOG_v1 {"time_micros": 1765015643927477, "job": 27, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 06 10:07:23 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:07:23.927512) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 06 10:07:23 compute-1 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 27] Try to delete WAL files size 2816081, prev total WAL file size 2816081, number of live WAL files 2.
Dec 06 10:07:23 compute-1 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000046.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:07:23 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:07:23.928507) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031373537' seq:72057594037927935, type:22 .. '7061786F730032303039' seq:0, type:0; will stop at (end)
Dec 06 10:07:23 compute-1 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 28] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 06 10:07:23 compute-1 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 27 Base level 0, inputs: [50(1777KB)], [48(13MB)]
Dec 06 10:07:23 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015643928678, "job": 28, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [50], "files_L6": [48], "score": -1, "input_data_size": 15534945, "oldest_snapshot_seqno": -1}
Dec 06 10:07:23 compute-1 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 28] Generated table #51: 5433 keys, 13334205 bytes, temperature: kUnknown
Dec 06 10:07:23 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015643991238, "cf_name": "default", "job": 28, "event": "table_file_creation", "file_number": 51, "file_size": 13334205, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13297953, "index_size": 21550, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13637, "raw_key_size": 139443, "raw_average_key_size": 25, "raw_value_size": 13199507, "raw_average_value_size": 2429, "num_data_blocks": 875, "num_entries": 5433, "num_filter_entries": 5433, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014012, "oldest_key_time": 0, "file_creation_time": 1765015643, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 51, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:07:23 compute-1 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:07:23 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:07:23.991698) [db/compaction/compaction_job.cc:1663] [default] [JOB 28] Compacted 1@0 + 1@6 files to L6 => 13334205 bytes
Dec 06 10:07:23 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:07:23.993174) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 247.7 rd, 212.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.7, 13.1 +0.0 blob) out(12.7 +0.0 blob), read-write-amplify(15.9) write-amplify(7.3) OK, records in: 5950, records dropped: 517 output_compression: NoCompression
Dec 06 10:07:23 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:07:23.993217) EVENT_LOG_v1 {"time_micros": 1765015643993197, "job": 28, "event": "compaction_finished", "compaction_time_micros": 62705, "compaction_time_cpu_micros": 28249, "output_level": 6, "num_output_files": 1, "total_output_size": 13334205, "num_input_records": 5950, "num_output_records": 5433, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 06 10:07:23 compute-1 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000050.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:07:23 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015643994217, "job": 28, "event": "table_file_deletion", "file_number": 50}
Dec 06 10:07:23 compute-1 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000048.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:07:23 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015643999696, "job": 28, "event": "table_file_deletion", "file_number": 48}
Dec 06 10:07:24 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:07:23.928292) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:07:24 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:07:23.999780) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:07:24 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:07:23.999788) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:07:24 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:07:23.999791) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:07:24 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:07:23.999794) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:07:24 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:07:23.999798) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
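
The JOB 28 summary above can be checked by hand: RocksDB reports amplification relative to the bytes entering from level 0 — the 1.8 MB flush table #50 — combined with the 13 MB it had to re-read from the existing L6 file #48. A short arithmetic check using the byte counts from the EVENT_LOG lines:

# Hedged arithmetic check of the compaction summary logged above.
l0_in   = 1_820_143       # flushed table #50 (the compaction's L0 input)
read_in = 15_534_945      # input_data_size: L0 file #50 + existing L6 file #48
out     = 13_334_205      # total_output_size: the new L6 table #51

print(round(out / l0_in, 1))              # 7.3  -> write-amplify(7.3)
print(round((read_in + out) / l0_in, 1))  # 15.9 -> read-write-amplify(15.9)
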
Dec 06 10:07:24 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:07:24 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:07:24 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:24.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:07:24 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:24 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400054e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:24 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:24 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218004850 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:24 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:07:24 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:07:24 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:24.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:07:24 compute-1 ceph-mon[79770]: pgmap v786: 337 pgs: 337 active+clean; 88 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 28 op/s
Dec 06 10:07:24 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:07:24 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/175523667' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec 06 10:07:25 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:25 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228002320 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:25 compute-1 ceph-mon[79770]: pgmap v787: 337 pgs: 337 active+clean; 88 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 17 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Dec 06 10:07:26 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:07:26 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:07:26 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:26.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:07:26 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:26 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228002320 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:26 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:26 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240005500 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:26 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:07:26 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:07:26 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:26.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:07:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:27 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218004870 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:28 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:07:28 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:07:28 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:28.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:07:28 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:28 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228002320 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:28 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:28 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228002320 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:28 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:07:28 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:07:28 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:28.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:07:28 compute-1 ceph-mon[79770]: pgmap v788: 337 pgs: 337 active+clean; 88 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 17 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Dec 06 10:07:28 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:07:29 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:29 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240005520 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:30 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:07:30 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:07:30 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:30.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:07:30 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:30 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218004890 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:30 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:30 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c003370 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:30 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:07:30 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:07:30 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:30.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:07:30 compute-1 ceph-mon[79770]: pgmap v789: 337 pgs: 337 active+clean; 88 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 465 KiB/s rd, 1.8 MiB/s wr, 52 op/s
Dec 06 10:07:31 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:31 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228002320 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:31 compute-1 sudo[232132]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:07:31 compute-1 sudo[232132]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:07:31 compute-1 sudo[232132]: pam_unix(sudo:session): session closed for user root
Dec 06 10:07:32 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:07:32 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:07:32 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:32.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:07:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:32 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240005540 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:32 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22180048b0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:32 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:07:32 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:07:32 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:32.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:07:32 compute-1 ceph-mon[79770]: pgmap v790: 337 pgs: 337 active+clean; 88 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 465 KiB/s rd, 1.8 MiB/s wr, 52 op/s
Dec 06 10:07:33 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:33 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c003370 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:33 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:07:34 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:07:34 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:07:34 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:34.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:07:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:34 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228002320 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:34 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240005560 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:34 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:07:34 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:07:34 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:34.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:07:34 compute-1 ceph-mon[79770]: pgmap v791: 337 pgs: 337 active+clean; 88 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 101 op/s
Dec 06 10:07:35 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:35 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c520 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:36 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:07:36 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:07:36 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:36.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:07:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:36 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c003370 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:36 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228002320 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:36 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:07:36 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:07:36 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:36.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:07:36 compute-1 nova_compute[228576]: 2025-12-06 10:07:36.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:07:36 compute-1 nova_compute[228576]: 2025-12-06 10:07:36.471 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
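
These nova-compute DEBUG lines are oslo.service's periodic-task loop firing on its timer; _reclaim_queued_deletes bails out up front because reclaim_instance_interval is not positive, so soft-deleted instances are never purged from this host. A plain-Python model of that guard (not nova's code):

# Hedged model of the guard logged above.
class Conf:
    reclaim_instance_interval = 0   # non-positive: reclaim disabled

def reclaim_queued_deletes(conf: Conf) -> None:
    if conf.reclaim_instance_interval <= 0:
        print('CONF.reclaim_instance_interval <= 0, skipping...')
        return
    # Otherwise nova would look up SOFT_DELETED instances older than the
    # interval and delete them for real.

reclaim_queued_deletes(Conf())
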
Dec 06 10:07:36 compute-1 ceph-mon[79770]: pgmap v792: 337 pgs: 337 active+clean; 88 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 73 op/s
Dec 06 10:07:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:37 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240005580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:38 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:07:38 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:07:38 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:38.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:07:38 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:38 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c520 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:38 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:38 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c003370 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:38 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:07:38 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:07:38 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:38.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:07:38 compute-1 ceph-mon[79770]: pgmap v793: 337 pgs: 337 active+clean; 88 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 73 op/s
Dec 06 10:07:38 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:07:39 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:39 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228002320 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:39 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:07:39 compute-1 podman[232161]: 2025-12-06 10:07:39.81174009 +0000 UTC m=+0.101618532 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:07:40 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:07:40 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:07:40 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:40.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:07:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:40 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400055a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:40 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c520 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:40 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:07:40 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:07:40 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:40.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:07:40 compute-1 nova_compute[228576]: 2025-12-06 10:07:40.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:07:40 compute-1 nova_compute[228576]: 2025-12-06 10:07:40.471 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:07:40 compute-1 nova_compute[228576]: 2025-12-06 10:07:40.471 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:07:40 compute-1 nova_compute[228576]: 2025-12-06 10:07:40.487 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 06 10:07:40 compute-1 nova_compute[228576]: 2025-12-06 10:07:40.488 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:07:40 compute-1 ceph-mon[79770]: pgmap v794: 337 pgs: 337 active+clean; 88 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 73 op/s
Dec 06 10:07:40 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/698280352' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:07:41 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:41 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c003370 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:41 compute-1 nova_compute[228576]: 2025-12-06 10:07:41.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:07:41 compute-1 nova_compute[228576]: 2025-12-06 10:07:41.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:07:41 compute-1 nova_compute[228576]: 2025-12-06 10:07:41.493 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:07:41 compute-1 nova_compute[228576]: 2025-12-06 10:07:41.494 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:07:41 compute-1 nova_compute[228576]: 2025-12-06 10:07:41.494 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:07:41 compute-1 nova_compute[228576]: 2025-12-06 10:07:41.494 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:07:41 compute-1 nova_compute[228576]: 2025-12-06 10:07:41.494 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:07:41 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/3639730022' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:07:41 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:07:41 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/656854972' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:07:41 compute-1 nova_compute[228576]: 2025-12-06 10:07:41.935 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.441s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:07:41 compute-1 sshd-session[232187]: Invalid user ubuntu from 222.88.225.195 port 44214
Dec 06 10:07:42 compute-1 nova_compute[228576]: 2025-12-06 10:07:42.092 228580 WARNING nova.virt.libvirt.driver [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:07:42 compute-1 nova_compute[228576]: 2025-12-06 10:07:42.094 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5198MB free_disk=59.96738052368164GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:07:42 compute-1 nova_compute[228576]: 2025-12-06 10:07:42.094 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:07:42 compute-1 nova_compute[228576]: 2025-12-06 10:07:42.095 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:07:42 compute-1 nova_compute[228576]: 2025-12-06 10:07:42.159 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:07:42 compute-1 nova_compute[228576]: 2025-12-06 10:07:42.160 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:07:42 compute-1 nova_compute[228576]: 2025-12-06 10:07:42.175 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:07:42 compute-1 sshd-session[232187]: Received disconnect from 222.88.225.195 port 44214:11:  [preauth]
Dec 06 10:07:42 compute-1 sshd-session[232187]: Disconnected from invalid user ubuntu 222.88.225.195 port 44214 [preauth]
Dec 06 10:07:42 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:07:42 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:07:42 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:42.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:07:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:42 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228004c80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:42 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400055c0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:42 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:07:42 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:07:42 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:42.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:07:42 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:07:42 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4086081083' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:07:42 compute-1 nova_compute[228576]: 2025-12-06 10:07:42.663 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.488s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:07:42 compute-1 nova_compute[228576]: 2025-12-06 10:07:42.670 228580 DEBUG nova.compute.provider_tree [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed in ProviderTree for provider: ff2f17cb-ff1d-4da7-9560-4be741380cb1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:07:42 compute-1 nova_compute[228576]: 2025-12-06 10:07:42.697 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed for provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:07:42 compute-1 nova_compute[228576]: 2025-12-06 10:07:42.699 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:07:42 compute-1 nova_compute[228576]: 2025-12-06 10:07:42.699 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.604s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:07:42 compute-1 ceph-mon[79770]: pgmap v795: 337 pgs: 337 active+clean; 88 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.5 MiB/s rd, 48 op/s
Dec 06 10:07:42 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/656854972' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:07:42 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/4086081083' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:07:43 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:43 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c520 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:43 compute-1 nova_compute[228576]: 2025-12-06 10:07:43.700 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:07:43 compute-1 nova_compute[228576]: 2025-12-06 10:07:43.701 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:07:43 compute-1 nova_compute[228576]: 2025-12-06 10:07:43.701 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:07:43 compute-1 nova_compute[228576]: 2025-12-06 10:07:43.701 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:07:43 compute-1 podman[232235]: 2025-12-06 10:07:43.758190381 +0000 UTC m=+0.061535015 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 06 10:07:43 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:07:44 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:07:44 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:07:44 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:44.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:07:44 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:44 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c003370 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:44 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:44 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228004c80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:44 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:07:44 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:07:44 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:44.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:07:44 compute-1 ceph-mon[79770]: pgmap v796: 337 pgs: 337 active+clean; 121 MiB data, 302 MiB used, 60 GiB / 60 GiB avail; 1.8 MiB/s rd, 2.1 MiB/s wr, 112 op/s
Dec 06 10:07:44 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/399366570' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:07:44 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/3598102245' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:07:45 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:45 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400055e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:46 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:07:46 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:07:46 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:46.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:07:46 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:46 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c520 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:46 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:46 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c520 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:46 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:07:46 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:07:46 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:46.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:07:46 compute-1 ceph-mon[79770]: pgmap v797: 337 pgs: 337 active+clean; 121 MiB data, 302 MiB used, 60 GiB / 60 GiB avail; 327 KiB/s rd, 2.1 MiB/s wr, 64 op/s
Dec 06 10:07:46 compute-1 ceph-mon[79770]: from='client.? 192.168.122.10:0/3571976026' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 06 10:07:46 compute-1 ceph-mon[79770]: from='client.? 192.168.122.10:0/3571976026' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 06 10:07:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:47 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228004c80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:47 compute-1 podman[232258]: 2025-12-06 10:07:47.746406432 +0000 UTC m=+0.055655664 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Dec 06 10:07:48 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:07:48 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:07:48 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:48.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:07:48 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:48 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400055e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:48 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:48 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:48 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:07:48 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:07:48 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:48.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:07:48 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:07:48 compute-1 ceph-mon[79770]: pgmap v798: 337 pgs: 337 active+clean; 121 MiB data, 302 MiB used, 60 GiB / 60 GiB avail; 327 KiB/s rd, 2.1 MiB/s wr, 64 op/s
Dec 06 10:07:49 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:49 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c520 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:50 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:07:50 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:07:50 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:50.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:07:50 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:50 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228004c80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:50 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:50 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228004c80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:50 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:07:50 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:07:50 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:50.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:07:50 compute-1 ceph-mon[79770]: pgmap v799: 337 pgs: 337 active+clean; 121 MiB data, 302 MiB used, 60 GiB / 60 GiB avail; 327 KiB/s rd, 2.1 MiB/s wr, 64 op/s
Dec 06 10:07:51 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:51 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001bd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:51 compute-1 sudo[232282]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:07:51 compute-1 sudo[232282]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:07:51 compute-1 sudo[232282]: pam_unix(sudo:session): session closed for user root
Dec 06 10:07:52 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:07:52 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:07:52 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:52.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:07:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:52 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c520 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:52 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c520 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:52 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:07:52 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:07:52 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:52.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:07:52 compute-1 sshd-session[232280]: Received disconnect from 8.210.103.161 port 38250:11: Bye Bye [preauth]
Dec 06 10:07:52 compute-1 sshd-session[232280]: Disconnected from authenticating user root 8.210.103.161 port 38250 [preauth]
Dec 06 10:07:52 compute-1 ceph-mon[79770]: pgmap v800: 337 pgs: 337 active+clean; 121 MiB data, 302 MiB used, 60 GiB / 60 GiB avail; 327 KiB/s rd, 2.1 MiB/s wr, 64 op/s
Dec 06 10:07:53 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:53 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240005640 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:53 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:07:53 compute-1 ceph-mon[79770]: pgmap v801: 337 pgs: 337 active+clean; 121 MiB data, 302 MiB used, 60 GiB / 60 GiB avail; 327 KiB/s rd, 2.1 MiB/s wr, 65 op/s
Dec 06 10:07:53 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:07:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:07:54.281 141446 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:07:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:07:54.281 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:07:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:07:54.281 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:07:54 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:07:54 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:07:54 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:54.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:07:54 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001bd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:54 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001bd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:54 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:07:54 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:07:54 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:54.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:07:55 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:55 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c520 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:56 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:07:56 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:07:56 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:56.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:07:56 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:56 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240005660 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:56 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:56 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001bd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:56 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:07:56 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:07:56 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:56.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:07:56 compute-1 ceph-mon[79770]: pgmap v802: 337 pgs: 337 active+clean; 121 MiB data, 302 MiB used, 60 GiB / 60 GiB avail; 938 B/s rd, 14 KiB/s wr, 0 op/s
Dec 06 10:07:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:57 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001bd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:58 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:07:58 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:07:58 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:07:58.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:07:58 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:58 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c520 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:58 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:58 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240005680 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:07:58 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:07:58 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:07:58 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:07:58.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:07:58 compute-1 ceph-mon[79770]: pgmap v803: 337 pgs: 337 active+clean; 121 MiB data, 302 MiB used, 60 GiB / 60 GiB avail; 938 B/s rd, 14 KiB/s wr, 0 op/s
Dec 06 10:07:58 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:07:59 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:07:59 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001bd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:00 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:08:00 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:08:00 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:00.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:08:00 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:00 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2228004c80 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:00 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:00 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c520 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:00 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:08:00 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:08:00 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:00.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:08:00 compute-1 ceph-mon[79770]: pgmap v804: 337 pgs: 337 active+clean; 121 MiB data, 302 MiB used, 60 GiB / 60 GiB avail; 938 B/s rd, 15 KiB/s wr, 1 op/s
Dec 06 10:08:01 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:01 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400056a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:02 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:08:02 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:08:02 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:02.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:08:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:02 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001bd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:02 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224002e50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:02 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:08:02 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:08:02 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:02.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:08:02 compute-1 ceph-mon[79770]: pgmap v805: 337 pgs: 337 active+clean; 121 MiB data, 302 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 3.0 KiB/s wr, 0 op/s
Dec 06 10:08:03 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:03 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c520 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:03 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:08:04 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:08:04 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:08:04 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:04.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:08:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:04 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400056a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:04 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001bd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:04 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:08:04 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:08:04 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:04.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:08:04 compute-1 ceph-mon[79770]: pgmap v806: 337 pgs: 337 active+clean; 121 MiB data, 302 MiB used, 60 GiB / 60 GiB avail; 8.2 KiB/s rd, 3.0 KiB/s wr, 1 op/s
Dec 06 10:08:05 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:05 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224002e50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:06 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:08:06 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:08:06 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:06.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:08:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:06 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c520 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:06 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400056a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:06 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:08:06 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:08:06 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:06.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:08:07 compute-1 sudo[232316]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:08:07 compute-1 sudo[232316]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:08:07 compute-1 sudo[232316]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:07 compute-1 sudo[232341]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 06 10:08:07 compute-1 sudo[232341]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:08:07 compute-1 ceph-mon[79770]: pgmap v807: 337 pgs: 337 active+clean; 121 MiB data, 302 MiB used, 60 GiB / 60 GiB avail; 7.9 KiB/s rd, 1023 B/s wr, 1 op/s
Dec 06 10:08:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:07 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001bd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:07 compute-1 sudo[232341]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:08 compute-1 ceph-mon[79770]: pgmap v808: 337 pgs: 337 active+clean; 121 MiB data, 302 MiB used, 60 GiB / 60 GiB avail; 7.9 KiB/s rd, 1023 B/s wr, 1 op/s
Dec 06 10:08:08 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 10:08:08 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 06 10:08:08 compute-1 ceph-mon[79770]: pgmap v809: 337 pgs: 337 active+clean; 121 MiB data, 302 MiB used, 60 GiB / 60 GiB avail; 9.1 KiB/s rd, 1.2 KiB/s wr, 1 op/s
Dec 06 10:08:08 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:08:08 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:08:08 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 06 10:08:08 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 06 10:08:08 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 10:08:08 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:08:08 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:08:08 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:08.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:08:08 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:08 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224002e50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:08 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:08 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c520 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:08 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:08:08 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:08:08 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:08.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:08:08 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:08:09 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/2093960658' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:08:09 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:08:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:09 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c520 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:10 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:10 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001bd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:10 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:08:10 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:08:10 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:10.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:08:10 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:10 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224002e50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:10 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:08:10 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:08:10 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:10.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:08:10 compute-1 ceph-mon[79770]: pgmap v810: 337 pgs: 337 active+clean; 41 MiB data, 274 MiB used, 60 GiB / 60 GiB avail; 31 KiB/s rd, 6.0 KiB/s wr, 33 op/s
Dec 06 10:08:10 compute-1 podman[232399]: 2025-12-06 10:08:10.811751465 +0000 UTC m=+0.106699545 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125)
Dec 06 10:08:11 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:11 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400056a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:12 compute-1 sudo[232425]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:08:12 compute-1 sudo[232425]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:08:12 compute-1 sudo[232425]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:12 compute-1 ceph-mon[79770]: pgmap v811: 337 pgs: 337 active+clean; 41 MiB data, 274 MiB used, 60 GiB / 60 GiB avail; 31 KiB/s rd, 6.0 KiB/s wr, 33 op/s
Dec 06 10:08:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:12 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c540 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:12 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:08:12 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:08:12 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:12.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:08:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:12 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001bd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:12 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:08:12 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:08:12 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:12.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:08:13 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:13 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224002e50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:13 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
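
The _set_new_cache_sizes line is the mon's cache autotuner republishing its targets; the same values recur roughly every five seconds in this section because nothing changed. The split covers the incremental-osdmap cache, the full-osdmap cache, and the key-value (RocksDB) cache. In round units (pure arithmetic on the logged bytes):

    # Unit conversion only; values copied from the log line above.
    vals = {
        "cache_size": 1020054731,  # overall target, ~973 MiB (~0.95 GiB)
        "inc_alloc":   343932928,  # 328 MiB, incremental-osdmap cache
        "full_alloc":  348127232,  # 332 MiB, full-osdmap cache
        "kv_alloc":    318767104,  # 304 MiB, key-value (RocksDB) cache
    }
    for name, nbytes in vals.items():
        print(f"{name}: {nbytes / 2**20:.0f} MiB")
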
Dec 06 10:08:14 compute-1 ceph-mon[79770]: pgmap v812: 337 pgs: 337 active+clean; 41 MiB data, 274 MiB used, 60 GiB / 60 GiB avail; 22 KiB/s rd, 6.0 KiB/s wr, 32 op/s
Dec 06 10:08:14 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:14 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400056c0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:14 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:08:14 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:08:14 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:14.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:08:14 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:14 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c560 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:14 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:08:14 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:08:14 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:14.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:08:14 compute-1 podman[232452]: 2025-12-06 10:08:14.773604146 +0000 UTC m=+0.080483622 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:08:14 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:08:14.915 141446 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '9a:dc:0d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'b6:0a:c4:b8:be:39'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:08:14 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:08:14.916 141446 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 06 10:08:15 compute-1 sudo[232471]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:08:15 compute-1 sudo[232471]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:08:15 compute-1 sudo[232471]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:15 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:15 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001bd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:16 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:08:16 compute-1 ceph-mon[79770]: pgmap v813: 337 pgs: 337 active+clean; 41 MiB data, 274 MiB used, 60 GiB / 60 GiB avail; 22 KiB/s rd, 6.0 KiB/s wr, 32 op/s
Dec 06 10:08:16 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:16 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224002e50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:16 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:08:16 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:08:16 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:16.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:08:16 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:16 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400056e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:16 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:08:16 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:08:16 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:16.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:08:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:17 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c580 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:17 compute-1 ceph-mon[79770]: pgmap v814: 337 pgs: 337 active+clean; 41 MiB data, 274 MiB used, 60 GiB / 60 GiB avail; 22 KiB/s rd, 6.0 KiB/s wr, 32 op/s
Dec 06 10:08:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:18 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001bd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:18 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:08:18 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:08:18 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:18.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:08:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:18 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c003370 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:18 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:08:18 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:08:18 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:18.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:08:18 compute-1 podman[232499]: 2025-12-06 10:08:18.760054283 +0000 UTC m=+0.062608721 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec 06 10:08:18 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:08:19 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:19 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240005790 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:20 compute-1 ceph-mon[79770]: pgmap v815: 337 pgs: 337 active+clean; 41 MiB data, 274 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 5.2 KiB/s wr, 28 op/s
Dec 06 10:08:20 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:20 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c5a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:20 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:08:20 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:08:20 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:20.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:08:20 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:20 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c5a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:20 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:08:20 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:08:20 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:20.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:08:21 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:21 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c5a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:21 compute-1 ceph-mon[79770]: pgmap v816: 337 pgs: 337 active+clean; 41 MiB data, 274 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:08:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:22 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224002e50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:22 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:08:22 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:08:22 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:22.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:08:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:22 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240005860 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:22 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:08:22 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:08:22 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:22.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:08:22 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:08:22.919 141446 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=61eba479-a995-4b31-88b9-8ebfcea9907e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
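
This transaction closes the loop opened at 10:08:14: the agent saw SB_Global.nb_cfg move from 5 to 6, waited its randomized ~8-second splay ("Delaying updating chassis table for 8 seconds"), and now acknowledges by writing the sequence number into its Chassis_Private row. Roughly how that write is issued through ovsdbapp's public API (connection setup omitted; table, column, and key names follow the log):

    def ack_nb_cfg(sb_api, chassis_uuid, nb_cfg):
        # Equivalent of the DbSetCommand in the log: record that this
        # chassis has processed SB_Global.nb_cfg == nb_cfg, so ovn-northd
        # can track convergence. sb_api is an ovsdbapp API/Backend object.
        sb_api.db_set(
            "Chassis_Private", chassis_uuid,
            ("external_ids", {"neutron:ovn-metadata-sb-cfg": str(nb_cfg)}),
        ).execute(check_error=True)

    # e.g. ack_nb_cfg(sb_idl, "61eba479-a995-4b31-88b9-8ebfcea9907e", 6)
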
Dec 06 10:08:23 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:23 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210000d00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:23 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:08:23 compute-1 ceph-mon[79770]: pgmap v817: 337 pgs: 337 active+clean; 41 MiB data, 274 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:08:23 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:08:24 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:24 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c5c0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:24 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:24 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224002e50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:24 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:08:24 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:08:24 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:24.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:08:24 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:08:24 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:08:24 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:24.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:08:25 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:25 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240005880 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:26 compute-1 ceph-mon[79770]: pgmap v818: 337 pgs: 337 active+clean; 41 MiB data, 274 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:08:26 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:26 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210001820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:26 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:26 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c5e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:26 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:08:26 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:08:26 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:26.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:08:26 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:08:26 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:08:26 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:26.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:08:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:27 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224002e50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:27 compute-1 ceph-mon[79770]: pgmap v819: 337 pgs: 337 active+clean; 41 MiB data, 274 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:08:28 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:28 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400058a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:28 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:28 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210001820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:28 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:08:28 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:08:28 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:28.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:08:28 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:08:28 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:08:28 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:28.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:08:28 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:08:29 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:29 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c600 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:29 compute-1 ceph-mon[79770]: pgmap v820: 337 pgs: 337 active+clean; 41 MiB data, 274 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:08:30 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:30 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004e70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:30 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:30 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400058c0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:30 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:08:30 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:08:30 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:30.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:08:30 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:08:30 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:08:30 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:30.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:08:31 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:31 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210001820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:31 compute-1 ceph-mon[79770]: pgmap v821: 337 pgs: 337 active+clean; 41 MiB data, 274 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:08:32 compute-1 sudo[232529]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:08:32 compute-1 sudo[232529]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:08:32 compute-1 sudo[232529]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:32 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c620 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:32 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004e70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:32 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:08:32 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:08:32 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:32.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:08:32 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:08:32 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:08:32 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:32.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:08:33 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/1132300413' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:08:33 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:33 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400058e0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:33 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:08:34 compute-1 ceph-mon[79770]: pgmap v822: 337 pgs: 337 active+clean; 41 MiB data, 274 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:08:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:34 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210002cb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:34 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c640 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:34 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:08:34 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:08:34 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:34.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:08:34 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:08:34 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:08:34 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:34.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:08:35 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:35 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004e70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:35 compute-1 ceph-mon[79770]: pgmap v823: 337 pgs: 337 active+clean; 41 MiB data, 274 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:08:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:36 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240005900 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:36 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240005900 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:36 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:08:36 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:08:36 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:36.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:08:36 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:08:36 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:08:36 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:36.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:08:37 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/198037829' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec 06 10:08:37 compute-1 nova_compute[228576]: 2025-12-06 10:08:37.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:08:37 compute-1 nova_compute[228576]: 2025-12-06 10:08:37.471 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
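
nova-compute schedules its periodic tasks unconditionally, but several are config-gated; _reclaim_queued_deletes returns immediately because reclaim_instance_interval defaults to 0, meaning soft-deleted instances are never reclaimed unless the operator opts in. The gate in miniature (names mirror the log; logic simplified, not nova's actual code):

    # reclaim_instance_interval defaults to 0 in nova; the task still runs
    # on its timer, it just short-circuits, which is exactly what is logged.
    reclaim_instance_interval = 0

    def _reclaim_queued_deletes():
        if reclaim_instance_interval <= 0:
            print("CONF.reclaim_instance_interval <= 0, skipping...")
            return
        # ...otherwise: find instances soft-deleted for longer than the
        # interval and delete them for real...

    _reclaim_queued_deletes()
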
Dec 06 10:08:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:37 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c640 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:38 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:38 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004e70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:38 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:38 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240005900 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:38 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:08:38 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:08:38 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:38.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:08:38 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:08:38 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:08:38 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:38.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:08:38 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:08:39 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:39 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210002cb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:39 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/3332401644' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec 06 10:08:39 compute-1 ceph-mon[79770]: pgmap v824: 337 pgs: 337 active+clean; 41 MiB data, 274 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:08:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:40 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c660 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:40 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004e70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:40 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:08:40 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:08:40 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:40.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:08:40 compute-1 nova_compute[228576]: 2025-12-06 10:08:40.465 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:08:40 compute-1 nova_compute[228576]: 2025-12-06 10:08:40.482 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:08:40 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:08:40 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:08:40 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:40.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:08:40 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:08:40 compute-1 ceph-mon[79770]: pgmap v825: 337 pgs: 337 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Dec 06 10:08:41 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:41 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240005920 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:41 compute-1 podman[232559]: 2025-12-06 10:08:41.783126831 +0000 UTC m=+0.085820455 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 06 10:08:41 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/1219827938' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:08:41 compute-1 ceph-mon[79770]: pgmap v826: 337 pgs: 337 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Dec 06 10:08:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:42 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210002cb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:42 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c680 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:42 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:08:42 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:08:42 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:42.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:08:42 compute-1 nova_compute[228576]: 2025-12-06 10:08:42.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:08:42 compute-1 nova_compute[228576]: 2025-12-06 10:08:42.471 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:08:42 compute-1 nova_compute[228576]: 2025-12-06 10:08:42.471 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:08:42 compute-1 nova_compute[228576]: 2025-12-06 10:08:42.488 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 06 10:08:42 compute-1 nova_compute[228576]: 2025-12-06 10:08:42.488 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:08:42 compute-1 nova_compute[228576]: 2025-12-06 10:08:42.489 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:08:42 compute-1 nova_compute[228576]: 2025-12-06 10:08:42.516 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:08:42 compute-1 nova_compute[228576]: 2025-12-06 10:08:42.517 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:08:42 compute-1 nova_compute[228576]: 2025-12-06 10:08:42.517 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:08:42 compute-1 nova_compute[228576]: 2025-12-06 10:08:42.517 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:08:42 compute-1 nova_compute[228576]: 2025-12-06 10:08:42.518 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:08:42 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:08:42 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:08:42 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:42.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:08:42 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:08:42 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3506121761' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:08:42 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/2139792515' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:08:42 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/3506121761' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:08:42 compute-1 nova_compute[228576]: 2025-12-06 10:08:42.984 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
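
The resource tracker shells out to the ceph CLI rather than using a bound client (twice in this audit, ~0.45 s each); the disk figures in the "Hypervisor/Node resource view" line below come from this JSON. The probe reproduced standalone, with the command line taken verbatim from the log (assumes the ceph CLI, conf file, and client.openstack keyring are present; JSON key names as in recent Ceph releases):

    import json
    import subprocess

    # Identical command line to the one logged by oslo_concurrency above.
    out = subprocess.check_output([
        "ceph", "df", "--format=json",
        "--id", "openstack", "--conf", "/etc/ceph/ceph.conf",
    ])
    stats = json.loads(out)["stats"]
    print(f"free: {stats['total_avail_bytes'] / 2**30:.2f} GiB "
          f"of {stats['total_bytes'] / 2**30:.2f} GiB")
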
Dec 06 10:08:43 compute-1 nova_compute[228576]: 2025-12-06 10:08:43.146 228580 WARNING nova.virt.libvirt.driver [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:08:43 compute-1 nova_compute[228576]: 2025-12-06 10:08:43.148 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5209MB free_disk=59.96752166748047GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:08:43 compute-1 nova_compute[228576]: 2025-12-06 10:08:43.148 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:08:43 compute-1 nova_compute[228576]: 2025-12-06 10:08:43.148 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:08:43 compute-1 nova_compute[228576]: 2025-12-06 10:08:43.213 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:08:43 compute-1 nova_compute[228576]: 2025-12-06 10:08:43.213 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:08:43 compute-1 nova_compute[228576]: 2025-12-06 10:08:43.244 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:08:43 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:43 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004e70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:43 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:08:43 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1405804968' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:08:43 compute-1 nova_compute[228576]: 2025-12-06 10:08:43.690 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:08:43 compute-1 nova_compute[228576]: 2025-12-06 10:08:43.695 228580 DEBUG nova.compute.provider_tree [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed in ProviderTree for provider: ff2f17cb-ff1d-4da7-9560-4be741380cb1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:08:43 compute-1 nova_compute[228576]: 2025-12-06 10:08:43.712 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed for provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
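
Placement computes schedulable capacity as (total - reserved) x allocation_ratio, so the unchanged inventory above advertises considerably more than the physical 8 vCPUs. Worked out from the logged values:

    # (total - reserved) * allocation_ratio; values copied from the log.
    inventory = {
        "VCPU":      {"total": 8,    "reserved": 0,   "allocation_ratio": 4.0},
        "MEMORY_MB": {"total": 7679, "reserved": 512, "allocation_ratio": 1.0},
        "DISK_GB":   {"total": 59,   "reserved": 0,   "allocation_ratio": 0.9},
    }
    for rc, v in inventory.items():
        cap = (v["total"] - v["reserved"]) * v["allocation_ratio"]
        print(rc, cap)   # VCPU 32.0, MEMORY_MB 7167.0, DISK_GB 53.1
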
Dec 06 10:08:43 compute-1 nova_compute[228576]: 2025-12-06 10:08:43.713 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:08:43 compute-1 nova_compute[228576]: 2025-12-06 10:08:43.714 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.565s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:08:43 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:08:43 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/1405804968' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:08:43 compute-1 ceph-mon[79770]: pgmap v827: 337 pgs: 337 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 101 op/s
Dec 06 10:08:44 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:44 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240005940 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:44 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:44 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:44 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:08:44 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:08:44 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:44.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:08:44 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/100844 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
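
Here the HAProxy instance fronting ganesha marks one NFS backend down after a refused TCP connect; two of the three servers remain, so the service stays up. This is a pure layer-4 check, equivalent to a timed connect attempt (host and port below are placeholders, not taken from the log):

    import socket

    def l4_check(host: str = "192.0.2.12", port: int = 2049,
                 timeout: float = 2.0) -> str:
        # HAProxy's default "check" at layer 4 is just a timed TCP connect;
        # a refused connection yields the reason string seen in the log.
        try:
            socket.create_connection((host, port), timeout=timeout).close()
            return "UP"
        except ConnectionRefusedError:
            return "DOWN, reason: Layer4 connection problem (Connection refused)"
        except OSError as exc:
            return f"DOWN ({exc})"
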
Dec 06 10:08:44 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:08:44 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:08:44 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:44.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:08:45 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:45 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c6a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:45 compute-1 nova_compute[228576]: 2025-12-06 10:08:45.695 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:08:45 compute-1 nova_compute[228576]: 2025-12-06 10:08:45.695 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:08:45 compute-1 nova_compute[228576]: 2025-12-06 10:08:45.695 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:08:45 compute-1 nova_compute[228576]: 2025-12-06 10:08:45.696 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:08:45 compute-1 podman[232631]: 2025-12-06 10:08:45.766783382 +0000 UTC m=+0.074121706 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent)
Dec 06 10:08:45 compute-1 ceph-mon[79770]: pgmap v828: 337 pgs: 337 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 100 op/s
Dec 06 10:08:45 compute-1 ceph-mon[79770]: from='client.? 192.168.122.10:0/4086049669' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 06 10:08:45 compute-1 ceph-mon[79770]: from='client.? 192.168.122.10:0/4086049669' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 06 10:08:46 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:46 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004e70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:46 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:46 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240005960 fd 50 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:46 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:08:46 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:08:46 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:46.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:08:46 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:08:46 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:08:46 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:46.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:08:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:47 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:48 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:48 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:48 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:48 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004e70 fd 49 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:48 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:08:48 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:08:48 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:48.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:08:48 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:08:48 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:08:48 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:48.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:08:48 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:08:49 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/2148744341' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:08:49 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/1552448253' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:08:49 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:49 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:49 compute-1 podman[232653]: 2025-12-06 10:08:49.784759144 +0000 UTC m=+0.084801200 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:08:50 compute-1 ceph-mon[79770]: pgmap v829: 337 pgs: 337 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 100 op/s
Dec 06 10:08:50 compute-1 ceph-mon[79770]: pgmap v830: 337 pgs: 337 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 100 op/s
Dec 06 10:08:50 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:50 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c6e0 fd 49 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:50 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:50 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:50 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:08:50 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:08:50 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:50.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:08:50 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:08:50 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:08:50 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:50.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:08:51 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:51 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004e70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:52 compute-1 ceph-mon[79770]: pgmap v831: 337 pgs: 337 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 73 op/s
Dec 06 10:08:52 compute-1 sudo[232676]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:08:52 compute-1 sudo[232676]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:08:52 compute-1 sudo[232676]: pam_unix(sudo:session): session closed for user root
Dec 06 10:08:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:52 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:52 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001bd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:52 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:08:52 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:08:52 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:52.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:08:52 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:08:52 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:08:52 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:52.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:08:53 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:53 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280030f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:53 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:08:53 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:08:53 compute-1 ceph-mon[79770]: pgmap v832: 337 pgs: 337 active+clean; 109 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 2.1 MiB/s rd, 2.0 MiB/s wr, 102 op/s
Dec 06 10:08:54 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:08:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:08:54.282 141446 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:08:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:08:54.282 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:08:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:08:54.282 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:08:54 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004e70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:54 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:54 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:08:54 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:08:54 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:54.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:08:54 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:08:54 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:08:54 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:54.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:08:55 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:55 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001bd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:56 compute-1 ceph-mon[79770]: pgmap v833: 337 pgs: 337 active+clean; 109 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 155 KiB/s rd, 2.0 MiB/s wr, 29 op/s
Dec 06 10:08:56 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:56 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280030f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:56 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:56 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280030f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:56 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:08:56 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:08:56 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:56.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:08:56 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:08:56 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:08:56 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:56.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:08:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:57 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:08:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:57 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:08:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:57 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:08:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:57 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:58 compute-1 ceph-mon[79770]: pgmap v834: 337 pgs: 337 active+clean; 109 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 155 KiB/s rd, 2.0 MiB/s wr, 29 op/s
Dec 06 10:08:58 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:58 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001bd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:58 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:58 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280030f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:08:58 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:08:58 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:08:58 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:08:58.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:08:58 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:08:58 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:08:58 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:08:58.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:08:58 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:08:59 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:08:59 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004e70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:00 compute-1 ceph-mon[79770]: pgmap v835: 337 pgs: 337 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 327 KiB/s rd, 2.1 MiB/s wr, 66 op/s
Dec 06 10:09:00 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:00 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec 06 10:09:00 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:00 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:00 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:00 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001bd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:00 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:09:00 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:09:00 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:00.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:09:00 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:09:00 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:09:00 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:00.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:09:01 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:01 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280030f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:02 compute-1 ceph-mon[79770]: pgmap v836: 337 pgs: 337 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 327 KiB/s rd, 2.1 MiB/s wr, 66 op/s
Dec 06 10:09:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:02 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004e70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:02 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:02 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:09:02 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:09:02 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:02.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:09:02 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:09:02 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:09:02 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:02.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:09:03 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:03 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001bd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:03 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:09:04 compute-1 ceph-mon[79770]: pgmap v837: 337 pgs: 337 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 333 KiB/s rd, 2.1 MiB/s wr, 68 op/s
Dec 06 10:09:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:04 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280030f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:04 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004e70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:04 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:09:04 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:09:04 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:04.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:09:04 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:09:04 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:09:04 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:04.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:09:05 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:05 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:06 compute-1 ceph-mon[79770]: pgmap v838: 337 pgs: 337 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 178 KiB/s rd, 107 KiB/s wr, 38 op/s
Dec 06 10:09:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:06 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001bd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:06 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280030f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:06 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:09:06 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:09:06 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:06.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:09:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/100906 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec 06 10:09:06 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:09:06 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:09:06 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:06.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:09:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:07 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004e70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:08 compute-1 ceph-mon[79770]: pgmap v839: 337 pgs: 337 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 178 KiB/s rd, 107 KiB/s wr, 38 op/s
Dec 06 10:09:08 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:08 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:08 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:08 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001bd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:08 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:09:08 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:09:08 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:08.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:09:08 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:09:08 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:09:08 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:08.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:09:08 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:09:09 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:09:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:09 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280030f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:10 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:10 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004e70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:10 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:10 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:10 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:09:10 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:09:10 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:10.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:09:10 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:09:10 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:09:10 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:10.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:09:11 compute-1 ceph-mon[79770]: pgmap v840: 337 pgs: 337 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 178 KiB/s rd, 113 KiB/s wr, 38 op/s
Dec 06 10:09:11 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/465585946' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:09:11 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:11 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001bd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:12 compute-1 sudo[232712]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:09:12 compute-1 sudo[232712]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:09:12 compute-1 sudo[232712]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:12 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280030f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:12 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004e70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:12 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:09:12 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:09:12 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:12.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:09:12 compute-1 podman[232736]: 2025-12-06 10:09:12.445201289 +0000 UTC m=+0.088915005 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3)
Dec 06 10:09:12 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:09:12 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:09:12 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:12.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:09:13 compute-1 ceph-mon[79770]: pgmap v841: 337 pgs: 337 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 5.9 KiB/s rd, 17 KiB/s wr, 1 op/s
Dec 06 10:09:13 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:13 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:13 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:09:14 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:14 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001bd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:14 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:14 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280030f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:14 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:09:14 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:09:14 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:14.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:09:14 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:09:14 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:09:14 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:14.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:09:15 compute-1 ceph-mon[79770]: pgmap v842: 337 pgs: 337 active+clean; 167 MiB data, 327 MiB used, 60 GiB / 60 GiB avail; 23 KiB/s rd, 1.8 MiB/s wr, 29 op/s
Dec 06 10:09:15 compute-1 sudo[232767]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:09:15 compute-1 sudo[232767]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:09:15 compute-1 sudo[232767]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:15 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:15 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004e70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:15 compute-1 sudo[232792]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 check-host
Dec 06 10:09:15 compute-1 sudo[232792]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:09:15 compute-1 podman[232833]: 2025-12-06 10:09:15.932101928 +0000 UTC m=+0.060518114 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 06 10:09:15 compute-1 sudo[232792]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:16 compute-1 sudo[232858]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:09:16 compute-1 sudo[232858]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:09:16 compute-1 sudo[232858]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:16 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/3189126703' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec 06 10:09:16 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:09:16 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:09:16 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:09:16 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:09:16 compute-1 sudo[232883]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 06 10:09:16 compute-1 sudo[232883]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:09:16 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:16 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:16 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:16 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001bd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:16 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:09:16 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:09:16 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:16.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:09:16 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:09:16 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:09:16 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:16.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:09:16 compute-1 sudo[232883]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:17 compute-1 ceph-mon[79770]: pgmap v843: 337 pgs: 337 active+clean; 167 MiB data, 327 MiB used, 60 GiB / 60 GiB avail; 17 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Dec 06 10:09:17 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/3347373695' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec 06 10:09:17 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 10:09:17 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 06 10:09:17 compute-1 ceph-mon[79770]: pgmap v844: 337 pgs: 337 active+clean; 167 MiB data, 327 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 2.0 MiB/s wr, 30 op/s
Dec 06 10:09:17 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:09:17 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:09:17 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 06 10:09:17 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 06 10:09:17 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 10:09:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:17 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280030f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:17 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:09:17.798 141446 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '9a:dc:0d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'b6:0a:c4:b8:be:39'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:09:17 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:09:17.801 141446 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 06 10:09:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:18 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004e70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:18 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004e70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:18 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:09:18 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:09:18 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:18.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:09:18 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:09:18 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:09:18 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:18.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:09:18 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:09:19 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:19 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001bf0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:19 compute-1 ceph-mon[79770]: pgmap v845: 337 pgs: 337 active+clean; 167 MiB data, 328 MiB used, 60 GiB / 60 GiB avail; 27 KiB/s rd, 2.0 MiB/s wr, 41 op/s
Dec 06 10:09:20 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:20 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280030f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:20 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:20 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280030f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:20 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:09:20 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:09:20 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:20.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:09:20 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:09:20 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:09:20 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:20.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:09:20 compute-1 podman[232941]: 2025-12-06 10:09:20.746374651 +0000 UTC m=+0.057800537 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=multipathd)
Dec 06 10:09:21 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:21 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280030f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:21 compute-1 ceph-mon[79770]: pgmap v846: 337 pgs: 337 active+clean; 167 MiB data, 328 MiB used, 60 GiB / 60 GiB avail; 27 KiB/s rd, 2.0 MiB/s wr, 41 op/s
Dec 06 10:09:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:22 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:22 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004190 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:22 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:09:22 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:09:22 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:22.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:09:22 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:09:22 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:09:22 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:22.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:09:22 compute-1 sudo[232964]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:09:22 compute-1 sudo[232964]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:09:22 compute-1 sudo[232964]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:23 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:23 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_39] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004e70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:23 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:09:23 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:09:23 compute-1 ceph-mon[79770]: pgmap v847: 337 pgs: 337 active+clean; 167 MiB data, 328 MiB used, 60 GiB / 60 GiB avail; 27 KiB/s rd, 2.0 MiB/s wr, 41 op/s
Dec 06 10:09:23 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:09:24 compute-1 sshd-session[232764]: ssh_dispatch_run_fatal: Connection from 14.103.118.140 port 46602: Connection timed out [preauth]
Dec 06 10:09:24 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:24 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_39] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:24 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:24 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224004e70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:24 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:09:24 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:09:24 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:24.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:09:24 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:09:24 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:09:24 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:24.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:09:24 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:09:25 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:25 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:25 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:09:25.803 141446 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=61eba479-a995-4b31-88b9-8ebfcea9907e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 10:09:25 compute-1 ceph-mon[79770]: pgmap v848: 337 pgs: 337 active+clean; 167 MiB data, 328 MiB used, 60 GiB / 60 GiB avail; 2.1 MiB/s rd, 14 KiB/s wr, 81 op/s
Dec 06 10:09:26 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:26 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004190 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:26 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:26 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_39] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:26 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:09:26 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:09:26 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:26.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:09:26 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:09:26 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:09:26 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:26.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:09:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:27 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:27 compute-1 ceph-mon[79770]: pgmap v849: 337 pgs: 337 active+clean; 167 MiB data, 328 MiB used, 60 GiB / 60 GiB avail; 2.1 MiB/s rd, 14 KiB/s wr, 81 op/s
Dec 06 10:09:28 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:28 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001c30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:28 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:28 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004190 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:28 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:09:28 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:09:28 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:28.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:09:28 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:09:28 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:09:28 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:28.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:09:28 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:09:29 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:29 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004190 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:29 compute-1 ceph-mon[79770]: pgmap v850: 337 pgs: 337 active+clean; 167 MiB data, 328 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 15 KiB/s wr, 74 op/s
Dec 06 10:09:30 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:30 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004190 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:30 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:30 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001c50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:30 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:09:30 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:09:30 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:30.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:09:30 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:09:30 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:09:30 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:30.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:09:30 compute-1 ceph-mon[79770]: pgmap v851: 337 pgs: 337 active+clean; 167 MiB data, 328 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 2.7 KiB/s wr, 64 op/s
Dec 06 10:09:31 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:31 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c001c50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:32 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218001230 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:32 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218001230 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:32 compute-1 sudo[232996]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:09:32 compute-1 sudo[232996]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:09:32 compute-1 sudo[232996]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:32 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:09:32 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:09:32 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:32.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:09:32 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:09:32 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:09:32 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:32.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:09:33 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:33 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:33 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:09:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:34 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:34 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218001230 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:34 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:09:34 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:09:34 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:34.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:09:34 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:09:34 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:09:34 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:34.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:09:34 compute-1 ceph-mon[79770]: pgmap v852: 337 pgs: 337 active+clean; 167 MiB data, 328 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 2.7 KiB/s wr, 64 op/s
Dec 06 10:09:35 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:35 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280030f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:35 compute-1 ceph-osd[77465]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 10:09:35 compute-1 ceph-osd[77465]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1800.1 total, 600.0 interval
                                           Cumulative writes: 9251 writes, 35K keys, 9251 commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.01 MB/s
                                           Cumulative WAL: 9251 writes, 2253 syncs, 4.11 writes per sync, written: 0.03 GB, 0.01 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 1447 writes, 4631 keys, 1447 commit groups, 1.0 writes per commit group, ingest: 5.55 MB, 0.01 MB/s
                                           Interval WAL: 1447 writes, 614 syncs, 2.36 writes per sync, written: 0.01 GB, 0.01 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 06 10:09:36 compute-1 ceph-mon[79770]: pgmap v853: 337 pgs: 337 active+clean; 200 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 2.2 MiB/s rd, 2.1 MiB/s wr, 128 op/s
Dec 06 10:09:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:36 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:36 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c004dd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:36 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:09:36 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:09:36 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:36.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:09:36 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:09:36 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:09:36 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:36.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:09:37 compute-1 ceph-mon[79770]: pgmap v854: 337 pgs: 337 active+clean; 200 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 327 KiB/s rd, 2.1 MiB/s wr, 64 op/s
Dec 06 10:09:37 compute-1 nova_compute[228576]: 2025-12-06 10:09:37.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:09:37 compute-1 nova_compute[228576]: 2025-12-06 10:09:37.471 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:09:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:37 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218001230 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:38 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:38 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218001230 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:38 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:38 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:38 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:09:38 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:09:38 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:38.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:09:38 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:09:38 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:09:38 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:38.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:09:38 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:09:39 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:39 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210003db0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:39 compute-1 ceph-mon[79770]: pgmap v855: 337 pgs: 337 active+clean; 200 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 327 KiB/s rd, 2.1 MiB/s wr, 65 op/s
Dec 06 10:09:39 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:09:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:40 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218001230 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:40 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218001230 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/100940 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec 06 10:09:40 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:09:40 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:09:40 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:40.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:09:40 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:09:40 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:09:40 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:40.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:09:41 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:41 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c004dd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:41 compute-1 ceph-mon[79770]: pgmap v856: 337 pgs: 337 active+clean; 200 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 327 KiB/s rd, 2.1 MiB/s wr, 64 op/s
Dec 06 10:09:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:42 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c004dd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:42 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218001230 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:42 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:09:42 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:09:42 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:42.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:09:42 compute-1 nova_compute[228576]: 2025-12-06 10:09:42.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:09:42 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:09:42 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:09:42 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:42.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:09:42 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/4286393203' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:09:42 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/979909076' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:09:42 compute-1 podman[233026]: 2025-12-06 10:09:42.777787817 +0000 UTC m=+0.084190518 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 06 10:09:43 compute-1 nova_compute[228576]: 2025-12-06 10:09:43.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:09:43 compute-1 nova_compute[228576]: 2025-12-06 10:09:43.495 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:09:43 compute-1 nova_compute[228576]: 2025-12-06 10:09:43.495 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:09:43 compute-1 nova_compute[228576]: 2025-12-06 10:09:43.496 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:09:43 compute-1 nova_compute[228576]: 2025-12-06 10:09:43.496 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:09:43 compute-1 nova_compute[228576]: 2025-12-06 10:09:43.496 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:09:43 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:43 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280030f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:43 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:09:43 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:09:43 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3709469109' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:09:43 compute-1 nova_compute[228576]: 2025-12-06 10:09:43.989 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.493s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:09:44 compute-1 nova_compute[228576]: 2025-12-06 10:09:44.152 228580 WARNING nova.virt.libvirt.driver [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:09:44 compute-1 nova_compute[228576]: 2025-12-06 10:09:44.154 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5211MB free_disk=59.89716339111328GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:09:44 compute-1 nova_compute[228576]: 2025-12-06 10:09:44.154 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:09:44 compute-1 nova_compute[228576]: 2025-12-06 10:09:44.154 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:09:44 compute-1 ceph-mon[79770]: pgmap v857: 337 pgs: 337 active+clean; 200 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 327 KiB/s rd, 2.1 MiB/s wr, 64 op/s
Dec 06 10:09:44 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/3729956179' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:09:44 compute-1 nova_compute[228576]: 2025-12-06 10:09:44.256 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:09:44 compute-1 nova_compute[228576]: 2025-12-06 10:09:44.256 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:09:44 compute-1 nova_compute[228576]: 2025-12-06 10:09:44.273 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:09:44 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:44 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c004dd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:44 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:44 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c004dd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:44 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:09:44 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:09:44 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:44.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:09:44 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:09:44 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:09:44 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:44.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:09:44 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:09:44 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3647347951' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:09:44 compute-1 nova_compute[228576]: 2025-12-06 10:09:44.726 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:09:44 compute-1 nova_compute[228576]: 2025-12-06 10:09:44.734 228580 DEBUG nova.compute.provider_tree [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed in ProviderTree for provider: ff2f17cb-ff1d-4da7-9560-4be741380cb1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:09:44 compute-1 nova_compute[228576]: 2025-12-06 10:09:44.766 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed for provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:09:44 compute-1 nova_compute[228576]: 2025-12-06 10:09:44.769 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:09:44 compute-1 nova_compute[228576]: 2025-12-06 10:09:44.769 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.615s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:09:45 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/3709469109' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:09:45 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/3647347951' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:09:45 compute-1 ceph-mon[79770]: pgmap v858: 337 pgs: 337 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 346 KiB/s rd, 2.1 MiB/s wr, 92 op/s
Dec 06 10:09:45 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:45 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c700 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:45 compute-1 nova_compute[228576]: 2025-12-06 10:09:45.769 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:09:45 compute-1 nova_compute[228576]: 2025-12-06 10:09:45.770 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:09:45 compute-1 nova_compute[228576]: 2025-12-06 10:09:45.770 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:09:45 compute-1 nova_compute[228576]: 2025-12-06 10:09:45.784 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 06 10:09:45 compute-1 nova_compute[228576]: 2025-12-06 10:09:45.784 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:09:45 compute-1 nova_compute[228576]: 2025-12-06 10:09:45.784 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:09:45 compute-1 nova_compute[228576]: 2025-12-06 10:09:45.784 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:09:45 compute-1 nova_compute[228576]: 2025-12-06 10:09:45.785 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:09:46 compute-1 ceph-mon[79770]: from='client.? 192.168.122.10:0/2942484271' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 06 10:09:46 compute-1 ceph-mon[79770]: from='client.? 192.168.122.10:0/2942484271' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 06 10:09:46 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:46 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280030f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:46 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:46 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c004dd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:46 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:09:46 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:09:46 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:46.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:09:46 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:09:46 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:09:46 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:46.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:09:46 compute-1 podman[233097]: 2025-12-06 10:09:46.751304543 +0000 UTC m=+0.058584217 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Dec 06 10:09:47 compute-1 nova_compute[228576]: 2025-12-06 10:09:47.479 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:09:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:47 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c004dd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:47 compute-1 ceph-mon[79770]: pgmap v859: 337 pgs: 337 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 13 KiB/s wr, 28 op/s
Dec 06 10:09:47 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/1962556856' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:09:48 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:48 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c700 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:48 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:48 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280030f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:48 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:09:48 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:09:48 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:48.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:09:48 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:09:48 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:09:48 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:48.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:09:48 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:09:49 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:49 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:09:49 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/3467548251' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:09:49 compute-1 ceph-mon[79770]: pgmap v860: 337 pgs: 337 active+clean; 48 MiB data, 266 MiB used, 60 GiB / 60 GiB avail; 37 KiB/s rd, 14 KiB/s wr, 53 op/s
Dec 06 10:09:49 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/279661780' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:09:49 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:49 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c004dd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:50 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:50 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210003e90 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:50 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:50 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c700 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:50 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:09:50 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:09:50 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:50.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:09:50 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:09:50 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:09:50 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:50.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:09:51 compute-1 ceph-mon[79770]: pgmap v861: 337 pgs: 337 active+clean; 41 MiB data, 263 MiB used, 60 GiB / 60 GiB avail; 39 KiB/s rd, 2.4 KiB/s wr, 56 op/s
Dec 06 10:09:51 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:51 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280030f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:51 compute-1 podman[233118]: 2025-12-06 10:09:51.752875277 +0000 UTC m=+0.059160291 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible)
Dec 06 10:09:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:52 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:09:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:52 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:09:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:52 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c004dd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:52 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210003eb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:52 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:09:52 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:09:52 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:52.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:09:52 compute-1 sudo[233140]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:09:52 compute-1 sudo[233140]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:09:52 compute-1 sudo[233140]: pam_unix(sudo:session): session closed for user root
Dec 06 10:09:52 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:09:52 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:09:52 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:52.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:09:53 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:53 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c700 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:53 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:09:53 compute-1 ceph-mon[79770]: pgmap v862: 337 pgs: 337 active+clean; 41 MiB data, 263 MiB used, 60 GiB / 60 GiB avail; 39 KiB/s rd, 2.4 KiB/s wr, 56 op/s
Dec 06 10:09:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:09:54.283 141446 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:09:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:09:54.283 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:09:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:09:54.284 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:09:54 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280030f0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:54 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c004dd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:54 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:09:54 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:09:54 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:54.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:09:54 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:09:54 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:09:54 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:54.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:09:54 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:09:55 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:55 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec 06 10:09:55 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:55 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c004dd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:55 compute-1 ceph-mon[79770]: pgmap v863: 337 pgs: 337 active+clean; 41 MiB data, 263 MiB used, 60 GiB / 60 GiB avail; 40 KiB/s rd, 3.2 KiB/s wr, 58 op/s
Dec 06 10:09:56 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:56 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c700 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:56 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:56 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280030f0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:56 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:09:56 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:09:56 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:56.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:09:56 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:09:56 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:09:56 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:56.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:09:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:57 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c004dd0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:57 compute-1 ceph-mon[79770]: pgmap v864: 337 pgs: 337 active+clean; 41 MiB data, 263 MiB used, 60 GiB / 60 GiB avail; 21 KiB/s rd, 2.1 KiB/s wr, 30 op/s
Dec 06 10:09:58 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:58 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c004dd0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:58 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:58 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c700 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:58 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:09:58 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:09:58 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:09:58.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:09:58 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:09:58 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:09:58 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:09:58.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:09:58 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:09:59 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:09:59 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:09:59 compute-1 ceph-mon[79770]: pgmap v865: 337 pgs: 337 active+clean; 41 MiB data, 263 MiB used, 60 GiB / 60 GiB avail; 21 KiB/s rd, 2.2 KiB/s wr, 31 op/s
Dec 06 10:10:00 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:00 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210003eb0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:00 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:00 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c004dd0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:00 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/101000 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec 06 10:10:00 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:10:00 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:10:00 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:00.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:10:00 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:10:00 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:10:00 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:00.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:10:00 compute-1 ceph-mon[79770]: Health detail: HEALTH_WARN 1 failed cephadm daemon(s)
Dec 06 10:10:00 compute-1 ceph-mon[79770]: [WRN] CEPHADM_FAILED_DAEMON: 1 failed cephadm daemon(s)
Dec 06 10:10:00 compute-1 ceph-mon[79770]:     daemon nfs.cephfs.2.0.compute-0.dfwxck on compute-0 is in unknown state
Dec 06 10:10:01 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:01 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c004dd0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:02 compute-1 ceph-mon[79770]: pgmap v866: 337 pgs: 337 active+clean; 41 MiB data, 263 MiB used, 60 GiB / 60 GiB avail; 3.5 KiB/s rd, 1.4 KiB/s wr, 5 op/s
Dec 06 10:10:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:02 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:02 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224003090 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:02 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:10:02 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:10:02 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:02.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:10:02 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:10:02 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:10:02 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:02.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:10:03 compute-1 ceph-mon[79770]: pgmap v867: 337 pgs: 337 active+clean; 41 MiB data, 263 MiB used, 60 GiB / 60 GiB avail; 1.7 KiB/s rd, 938 B/s wr, 2 op/s
Dec 06 10:10:03 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:03 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c700 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:03 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:10:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:04 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004190 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:04 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:04 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:10:04 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:10:04 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:04.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:10:04 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:10:04 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:10:04 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:04.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:10:05 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:05 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224003090 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:05 compute-1 ceph-mon[79770]: pgmap v868: 337 pgs: 337 active+clean; 41 MiB data, 260 MiB used, 60 GiB / 60 GiB avail; 1.8 KiB/s rd, 938 B/s wr, 2 op/s
Dec 06 10:10:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:06 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c700 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:06 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004190 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:06 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:10:06 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:10:06 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:06.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:10:06 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:10:06 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:10:06 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:06.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:10:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:07 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:07 compute-1 ceph-mon[79770]: pgmap v869: 337 pgs: 337 active+clean; 41 MiB data, 260 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Dec 06 10:10:08 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:08 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224003090 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:08 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:08 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c700 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:08 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:10:08 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:10:08 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:08.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:10:08 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:10:08 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:10:08 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:08.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:10:08 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:10:08 compute-1 ceph-mon[79770]: pgmap v870: 337 pgs: 337 active+clean; 41 MiB data, 260 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 85 B/s wr, 0 op/s
Dec 06 10:10:08 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:10:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:09 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004190 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:10 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:10 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:10 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:10 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224003090 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:10 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:10:10 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:10:10 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:10.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:10:10 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:10:10 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:10:10 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:10.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:10:11 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:11 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c700 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:11 compute-1 ceph-mon[79770]: pgmap v871: 337 pgs: 337 active+clean; 41 MiB data, 260 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Dec 06 10:10:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:12 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004190 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:12 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:12 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:10:12 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:10:12 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:12.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:10:12 compute-1 sudo[233177]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:10:12 compute-1 sudo[233177]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:10:12 compute-1 sudo[233177]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:12 compute-1 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 10:10:12 compute-1 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1800.0 total, 600.0 interval
                                           Cumulative writes: 5106 writes, 27K keys, 5106 commit groups, 1.0 writes per commit group, ingest: 0.07 GB, 0.04 MB/s
                                           Cumulative WAL: 5106 writes, 5106 syncs, 1.00 writes per sync, written: 0.07 GB, 0.04 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 1498 writes, 7288 keys, 1498 commit groups, 1.0 writes per commit group, ingest: 16.90 MB, 0.03 MB/s
                                           Interval WAL: 1499 writes, 1499 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0    115.3      0.33              0.15        14    0.024       0      0       0.0       0.0
                                             L6      1/0   12.72 MB   0.0      0.2     0.0      0.2       0.2      0.0       0.0   4.4    123.3    107.1      1.58              0.55        13    0.122     67K   6725       0.0       0.0
                                            Sum      1/0   12.72 MB   0.0      0.2     0.0      0.2       0.2      0.0       0.0   5.4    101.9    108.5      1.92              0.70        27    0.071     67K   6725       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   7.9    122.3    121.3      0.62              0.19        10    0.062     29K   2587       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Low      0/0    0.00 KB   0.0      0.2     0.0      0.2       0.2      0.0       0.0   0.0    123.3    107.1      1.58              0.55        13    0.122     67K   6725       0.0       0.0
                                           High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0    115.9      0.33              0.15        13    0.025       0      0       0.0       0.0
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1800.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.037, interval 0.009
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.20 GB write, 0.12 MB/s write, 0.19 GB read, 0.11 MB/s read, 1.9 seconds
                                           Interval compaction: 0.07 GB write, 0.13 MB/s write, 0.07 GB read, 0.13 MB/s read, 0.6 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fbbecff350#2 capacity: 304.00 MB usage: 13.55 MB table_size: 0 occupancy: 18446744073709551615 collections: 4 last_copies: 0 last_secs: 0.000157 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(720,13.00 MB,4.27649%) FilterBlock(27,201.92 KB,0.0648649%) IndexBlock(27,355.77 KB,0.114285%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
Dec 06 10:10:12 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:10:12 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:10:12 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:12.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:10:13 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:13 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224003090 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:13 compute-1 podman[233202]: 2025-12-06 10:10:13.790397615 +0000 UTC m=+0.100977973 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 06 10:10:13 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:10:13 compute-1 ceph-mon[79770]: pgmap v872: 337 pgs: 337 active+clean; 41 MiB data, 260 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Dec 06 10:10:14 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:14 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c700 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:14 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:14 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004190 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:14 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:10:14 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:10:14 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:14.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:10:14 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:10:14 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:10:14 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:14.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:10:14 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/3879393744' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:10:15 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:15 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:15 compute-1 ceph-mon[79770]: pgmap v873: 337 pgs: 337 active+clean; 41 MiB data, 260 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 B/s wr, 0 op/s
Dec 06 10:10:16 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:16 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224003090 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:16 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:16 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c700 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:16 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:10:16 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:10:16 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:16.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:10:16 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:10:16 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:10:16 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:16.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:10:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:17 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004190 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:17 compute-1 podman[233230]: 2025-12-06 10:10:17.750352657 +0000 UTC m=+0.060728760 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 06 10:10:17 compute-1 ceph-mon[79770]: pgmap v874: 337 pgs: 337 active+clean; 41 MiB data, 260 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:10:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:18 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:18 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224003090 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:18 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:10:18 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:10:18 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:18.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:10:18 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:10:18 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:10:18 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:18.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:10:18 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:10:18 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/4273770741' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec 06 10:10:18 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/2944879998' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec 06 10:10:19 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:19 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c700 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:20 compute-1 ceph-mon[79770]: pgmap v875: 337 pgs: 337 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Dec 06 10:10:20 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:10:20.402 141446 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '9a:dc:0d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'b6:0a:c4:b8:be:39'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:10:20 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:10:20.403 141446 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 06 10:10:20 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:20 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004190 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:20 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:20 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:20 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:10:20 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:10:20 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:20.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:10:20 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:10:20 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:10:20 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:20.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:10:21 compute-1 ceph-mon[79770]: pgmap v876: 337 pgs: 337 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 17 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Dec 06 10:10:21 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:10:21.405 141446 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=61eba479-a995-4b31-88b9-8ebfcea9907e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 10:10:21 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:21 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224003090 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:22 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c700 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:22 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004190 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:22 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:10:22 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:10:22 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:22.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:10:22 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:10:22 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:10:22 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:22.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:10:22 compute-1 podman[233252]: 2025-12-06 10:10:22.768741386 +0000 UTC m=+0.069533917 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd)
Dec 06 10:10:23 compute-1 sudo[233272]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:10:23 compute-1 sudo[233272]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:10:23 compute-1 sudo[233272]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:23 compute-1 sudo[233297]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 06 10:10:23 compute-1 sudo[233297]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:10:23 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:23 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:23 compute-1 sudo[233297]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:23 compute-1 sudo[233353]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:10:23 compute-1 sudo[233353]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:10:23 compute-1 sudo[233353]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:23 compute-1 sudo[233378]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ceph-volume --fsid 5ecd3f74-dade-5fc4-92ce-8950ae424258 -- inventory --format=json-pretty --filter-for-batch
Dec 06 10:10:23 compute-1 sudo[233378]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:10:23 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:10:23 compute-1 ceph-mon[79770]: pgmap v877: 337 pgs: 337 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 17 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Dec 06 10:10:23 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:10:23 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:10:24 compute-1 podman[233443]: 2025-12-06 10:10:24.30445373 +0000 UTC m=+0.050048646 container create 22f9a27de4233a0ac880f52c9cb8e2fc0b2d22c92029a97603240176e72c960f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zealous_brown, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid)
Dec 06 10:10:24 compute-1 systemd[1]: Started libpod-conmon-22f9a27de4233a0ac880f52c9cb8e2fc0b2d22c92029a97603240176e72c960f.scope.
Dec 06 10:10:24 compute-1 podman[233443]: 2025-12-06 10:10:24.281607246 +0000 UTC m=+0.027202172 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 06 10:10:24 compute-1 systemd[1]: Started libcrun container.
Dec 06 10:10:24 compute-1 podman[233443]: 2025-12-06 10:10:24.414048734 +0000 UTC m=+0.159643690 container init 22f9a27de4233a0ac880f52c9cb8e2fc0b2d22c92029a97603240176e72c960f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zealous_brown, OSD_FLAVOR=default, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 06 10:10:24 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:24 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224003090 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:24 compute-1 podman[233443]: 2025-12-06 10:10:24.425401284 +0000 UTC m=+0.170996220 container start 22f9a27de4233a0ac880f52c9cb8e2fc0b2d22c92029a97603240176e72c960f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zealous_brown, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, OSD_FLAVOR=default, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Dec 06 10:10:24 compute-1 podman[233443]: 2025-12-06 10:10:24.429467345 +0000 UTC m=+0.175062251 container attach 22f9a27de4233a0ac880f52c9cb8e2fc0b2d22c92029a97603240176e72c960f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zealous_brown, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 06 10:10:24 compute-1 zealous_brown[233460]: 167 167
Dec 06 10:10:24 compute-1 systemd[1]: libpod-22f9a27de4233a0ac880f52c9cb8e2fc0b2d22c92029a97603240176e72c960f.scope: Deactivated successfully.
Dec 06 10:10:24 compute-1 podman[233443]: 2025-12-06 10:10:24.434587871 +0000 UTC m=+0.180182757 container died 22f9a27de4233a0ac880f52c9cb8e2fc0b2d22c92029a97603240176e72c960f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zealous_brown, OSD_FLAVOR=default, org.label-schema.build-date=20250325, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 06 10:10:24 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:24 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c700 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:24 compute-1 systemd[1]: var-lib-containers-storage-overlay-69fc32efcb6bbf208d174d783e70de0b8b71e32cb94b90766027937fb6de5c49-merged.mount: Deactivated successfully.
Dec 06 10:10:24 compute-1 podman[233443]: 2025-12-06 10:10:24.477069129 +0000 UTC m=+0.222664025 container remove 22f9a27de4233a0ac880f52c9cb8e2fc0b2d22c92029a97603240176e72c960f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zealous_brown, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid)
Dec 06 10:10:24 compute-1 systemd[1]: libpod-conmon-22f9a27de4233a0ac880f52c9cb8e2fc0b2d22c92029a97603240176e72c960f.scope: Deactivated successfully.
Dec 06 10:10:24 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:10:24 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:10:24 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:24.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:10:24 compute-1 podman[233484]: 2025-12-06 10:10:24.644951932 +0000 UTC m=+0.049219746 container create 27b263c71547c71aba5e2f75e94c35a7a05afaacfe0002019ea7a189384972e8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=laughing_clarke, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS)
Dec 06 10:10:24 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:10:24 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:10:24 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:24.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:10:24 compute-1 systemd[1]: Started libpod-conmon-27b263c71547c71aba5e2f75e94c35a7a05afaacfe0002019ea7a189384972e8.scope.
Dec 06 10:10:24 compute-1 podman[233484]: 2025-12-06 10:10:24.62097504 +0000 UTC m=+0.025242864 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 06 10:10:24 compute-1 systemd[1]: Started libcrun container.
Dec 06 10:10:24 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5c41c57e792e340df659c20a30f2c7cf8d8c4bb4de71f57b4504afb46e76cbc6/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 06 10:10:24 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5c41c57e792e340df659c20a30f2c7cf8d8c4bb4de71f57b4504afb46e76cbc6/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 06 10:10:24 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5c41c57e792e340df659c20a30f2c7cf8d8c4bb4de71f57b4504afb46e76cbc6/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 06 10:10:24 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5c41c57e792e340df659c20a30f2c7cf8d8c4bb4de71f57b4504afb46e76cbc6/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 06 10:10:24 compute-1 podman[233484]: 2025-12-06 10:10:24.757112749 +0000 UTC m=+0.161380573 container init 27b263c71547c71aba5e2f75e94c35a7a05afaacfe0002019ea7a189384972e8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=laughing_clarke, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:10:24 compute-1 podman[233484]: 2025-12-06 10:10:24.766502091 +0000 UTC m=+0.170769905 container start 27b263c71547c71aba5e2f75e94c35a7a05afaacfe0002019ea7a189384972e8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=laughing_clarke, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, ceph=True, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default)
Dec 06 10:10:24 compute-1 podman[233484]: 2025-12-06 10:10:24.770182442 +0000 UTC m=+0.174450276 container attach 27b263c71547c71aba5e2f75e94c35a7a05afaacfe0002019ea7a189384972e8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=laughing_clarke, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 06 10:10:24 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:10:25 compute-1 laughing_clarke[233500]: [
Dec 06 10:10:25 compute-1 laughing_clarke[233500]:     {
Dec 06 10:10:25 compute-1 laughing_clarke[233500]:         "available": false,
Dec 06 10:10:25 compute-1 laughing_clarke[233500]:         "being_replaced": false,
Dec 06 10:10:25 compute-1 laughing_clarke[233500]:         "ceph_device_lvm": false,
Dec 06 10:10:25 compute-1 laughing_clarke[233500]:         "device_id": "QEMU_DVD-ROM_QM00001",
Dec 06 10:10:25 compute-1 laughing_clarke[233500]:         "lsm_data": {},
Dec 06 10:10:25 compute-1 laughing_clarke[233500]:         "lvs": [],
Dec 06 10:10:25 compute-1 laughing_clarke[233500]:         "path": "/dev/sr0",
Dec 06 10:10:25 compute-1 laughing_clarke[233500]:         "rejected_reasons": [
Dec 06 10:10:25 compute-1 laughing_clarke[233500]:             "Insufficient space (<5GB)",
Dec 06 10:10:25 compute-1 laughing_clarke[233500]:             "Has a FileSystem"
Dec 06 10:10:25 compute-1 laughing_clarke[233500]:         ],
Dec 06 10:10:25 compute-1 laughing_clarke[233500]:         "sys_api": {
Dec 06 10:10:25 compute-1 laughing_clarke[233500]:             "actuators": null,
Dec 06 10:10:25 compute-1 laughing_clarke[233500]:             "device_nodes": [
Dec 06 10:10:25 compute-1 laughing_clarke[233500]:                 "sr0"
Dec 06 10:10:25 compute-1 laughing_clarke[233500]:             ],
Dec 06 10:10:25 compute-1 laughing_clarke[233500]:             "devname": "sr0",
Dec 06 10:10:25 compute-1 laughing_clarke[233500]:             "human_readable_size": "482.00 KB",
Dec 06 10:10:25 compute-1 laughing_clarke[233500]:             "id_bus": "ata",
Dec 06 10:10:25 compute-1 laughing_clarke[233500]:             "model": "QEMU DVD-ROM",
Dec 06 10:10:25 compute-1 laughing_clarke[233500]:             "nr_requests": "2",
Dec 06 10:10:25 compute-1 laughing_clarke[233500]:             "parent": "/dev/sr0",
Dec 06 10:10:25 compute-1 laughing_clarke[233500]:             "partitions": {},
Dec 06 10:10:25 compute-1 laughing_clarke[233500]:             "path": "/dev/sr0",
Dec 06 10:10:25 compute-1 laughing_clarke[233500]:             "removable": "1",
Dec 06 10:10:25 compute-1 laughing_clarke[233500]:             "rev": "2.5+",
Dec 06 10:10:25 compute-1 laughing_clarke[233500]:             "ro": "0",
Dec 06 10:10:25 compute-1 laughing_clarke[233500]:             "rotational": "1",
Dec 06 10:10:25 compute-1 laughing_clarke[233500]:             "sas_address": "",
Dec 06 10:10:25 compute-1 laughing_clarke[233500]:             "sas_device_handle": "",
Dec 06 10:10:25 compute-1 laughing_clarke[233500]:             "scheduler_mode": "mq-deadline",
Dec 06 10:10:25 compute-1 laughing_clarke[233500]:             "sectors": 0,
Dec 06 10:10:25 compute-1 laughing_clarke[233500]:             "sectorsize": "2048",
Dec 06 10:10:25 compute-1 laughing_clarke[233500]:             "size": 493568.0,
Dec 06 10:10:25 compute-1 laughing_clarke[233500]:             "support_discard": "2048",
Dec 06 10:10:25 compute-1 laughing_clarke[233500]:             "type": "disk",
Dec 06 10:10:25 compute-1 laughing_clarke[233500]:             "vendor": "QEMU"
Dec 06 10:10:25 compute-1 laughing_clarke[233500]:         }
Dec 06 10:10:25 compute-1 laughing_clarke[233500]:     }
Dec 06 10:10:25 compute-1 laughing_clarke[233500]: ]
Dec 06 10:10:25 compute-1 systemd[1]: libpod-27b263c71547c71aba5e2f75e94c35a7a05afaacfe0002019ea7a189384972e8.scope: Deactivated successfully.
Dec 06 10:10:25 compute-1 podman[233484]: 2025-12-06 10:10:25.551391789 +0000 UTC m=+0.955659593 container died 27b263c71547c71aba5e2f75e94c35a7a05afaacfe0002019ea7a189384972e8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=laughing_clarke, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, OSD_FLAVOR=default)
Dec 06 10:10:25 compute-1 systemd[1]: var-lib-containers-storage-overlay-5c41c57e792e340df659c20a30f2c7cf8d8c4bb4de71f57b4504afb46e76cbc6-merged.mount: Deactivated successfully.
Dec 06 10:10:25 compute-1 podman[233484]: 2025-12-06 10:10:25.598788508 +0000 UTC m=+1.003056352 container remove 27b263c71547c71aba5e2f75e94c35a7a05afaacfe0002019ea7a189384972e8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=laughing_clarke, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/)
Dec 06 10:10:25 compute-1 systemd[1]: libpod-conmon-27b263c71547c71aba5e2f75e94c35a7a05afaacfe0002019ea7a189384972e8.scope: Deactivated successfully.
Dec 06 10:10:25 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:25 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004190 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:25 compute-1 sudo[233378]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:25 compute-1 ceph-mon[79770]: pgmap v878: 337 pgs: 337 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 101 op/s
Dec 06 10:10:25 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:10:25 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:10:26 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:26 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:26 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:26 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224003090 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:26 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:10:26 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:10:26 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:26.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:10:26 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:10:26 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:10:26 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:26.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:10:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:27 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c700 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:27 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:10:27 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:10:27 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 10:10:27 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 06 10:10:27 compute-1 ceph-mon[79770]: pgmap v879: 337 pgs: 337 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 2.0 MiB/s rd, 1.9 MiB/s wr, 106 op/s
Dec 06 10:10:27 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:10:27 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:10:27 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 06 10:10:27 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 06 10:10:27 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 10:10:28 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:28 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004190 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:28 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:28 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:28 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:10:28 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:10:28 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:28.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:10:28 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:10:28 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:10:28 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:28.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:10:28 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:10:29 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:29 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:29 compute-1 ceph-mon[79770]: pgmap v880: 337 pgs: 337 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 2.0 MiB/s rd, 1.9 MiB/s wr, 107 op/s
Dec 06 10:10:30 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:30 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800c700 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:30 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:30 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_41] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004190 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:30 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:10:30 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:10:30 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:30.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:10:30 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:10:30 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:10:30 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:30.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:10:31 compute-1 sudo[234642]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:10:31 compute-1 sudo[234642]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:10:31 compute-1 sudo[234642]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:31 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:31 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004190 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:31 compute-1 ceph-mon[79770]: pgmap v881: 337 pgs: 337 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 2.0 MiB/s rd, 13 KiB/s wr, 78 op/s
Dec 06 10:10:31 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:10:31 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:10:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:32 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004190 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:32 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210002830 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:32 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:10:32 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:10:32 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:32.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:10:32 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:10:32 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:10:32 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:32.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:10:32 compute-1 sudo[234669]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:10:32 compute-1 sudo[234669]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:10:32 compute-1 sudo[234669]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:33 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:33 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_44] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:33 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:10:33 compute-1 ceph-mon[79770]: pgmap v882: 337 pgs: 337 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 2.0 MiB/s rd, 13 KiB/s wr, 78 op/s
Dec 06 10:10:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:34 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224003090 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:34 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:34 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:10:34 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:10:34 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:34.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:10:34 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:10:34 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:10:34 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:34.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:10:35 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:35 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004190 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:35 compute-1 ceph-mon[79770]: pgmap v883: 337 pgs: 337 active+clean; 109 MiB data, 287 MiB used, 60 GiB / 60 GiB avail; 2.2 MiB/s rd, 2.2 MiB/s wr, 113 op/s
Dec 06 10:10:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:36 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:36 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:36 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:10:36 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:10:36 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:36.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:10:36 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:10:36 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:10:36 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:36.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:10:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:37 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:37 compute-1 ceph-mon[79770]: pgmap v884: 337 pgs: 337 active+clean; 109 MiB data, 287 MiB used, 60 GiB / 60 GiB avail; 200 KiB/s rd, 2.2 MiB/s wr, 35 op/s
Dec 06 10:10:38 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:38 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004190 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:38 compute-1 nova_compute[228576]: 2025-12-06 10:10:38.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:10:38 compute-1 nova_compute[228576]: 2025-12-06 10:10:38.472 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:10:38 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:38 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:38 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:10:38 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:10:38 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:38.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:10:38 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:10:38 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:10:38 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:38.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:10:38 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:10:39 compute-1 ceph-mon[79770]: pgmap v885: 337 pgs: 337 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 325 KiB/s rd, 2.1 MiB/s wr, 64 op/s
Dec 06 10:10:39 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:10:39 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:39 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:40 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:40 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224003090 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:40 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:10:40 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:10:40 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:40.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:10:40 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:10:40 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:10:40 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:40.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:10:41 compute-1 ceph-mon[79770]: pgmap v886: 337 pgs: 337 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 325 KiB/s rd, 2.1 MiB/s wr, 63 op/s
Dec 06 10:10:41 compute-1 nova_compute[228576]: 2025-12-06 10:10:41.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:10:41 compute-1 nova_compute[228576]: 2025-12-06 10:10:41.471 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Dec 06 10:10:41 compute-1 nova_compute[228576]: 2025-12-06 10:10:41.489 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Dec 06 10:10:41 compute-1 nova_compute[228576]: 2025-12-06 10:10:41.489 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:10:41 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:41 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224003090 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:42 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:42 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:42 compute-1 nova_compute[228576]: 2025-12-06 10:10:42.499 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:10:42 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:10:42 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:10:42 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:42.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:10:42 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:10:42 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:10:42 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:42.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:10:43 compute-1 ceph-mon[79770]: pgmap v887: 337 pgs: 337 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 325 KiB/s rd, 2.1 MiB/s wr, 63 op/s
Dec 06 10:10:43 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/132234717' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:10:43 compute-1 nova_compute[228576]: 2025-12-06 10:10:43.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:10:43 compute-1 nova_compute[228576]: 2025-12-06 10:10:43.471 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Dec 06 10:10:43 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:43 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:43 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:10:44 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/3976439775' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:10:44 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:44 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:44 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:44 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:44 compute-1 nova_compute[228576]: 2025-12-06 10:10:44.501 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:10:44 compute-1 nova_compute[228576]: 2025-12-06 10:10:44.526 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:10:44 compute-1 nova_compute[228576]: 2025-12-06 10:10:44.526 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:10:44 compute-1 nova_compute[228576]: 2025-12-06 10:10:44.526 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:10:44 compute-1 nova_compute[228576]: 2025-12-06 10:10:44.552 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 06 10:10:44 compute-1 nova_compute[228576]: 2025-12-06 10:10:44.552 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:10:44 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:10:44 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:10:44 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:44.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:10:44 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:10:44 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:10:44 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:44.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:10:44 compute-1 podman[234701]: 2025-12-06 10:10:44.822290438 +0000 UTC m=+0.118230618 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:10:45 compute-1 nova_compute[228576]: 2025-12-06 10:10:45.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:10:45 compute-1 nova_compute[228576]: 2025-12-06 10:10:45.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:10:45 compute-1 nova_compute[228576]: 2025-12-06 10:10:45.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:10:45 compute-1 nova_compute[228576]: 2025-12-06 10:10:45.492 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:10:45 compute-1 nova_compute[228576]: 2025-12-06 10:10:45.493 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:10:45 compute-1 nova_compute[228576]: 2025-12-06 10:10:45.493 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:10:45 compute-1 nova_compute[228576]: 2025-12-06 10:10:45.493 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:10:45 compute-1 nova_compute[228576]: 2025-12-06 10:10:45.494 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:10:45 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:45 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224003090 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:45 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:10:45 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2163291013' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:10:45 compute-1 nova_compute[228576]: 2025-12-06 10:10:45.993 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.498s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:10:46 compute-1 nova_compute[228576]: 2025-12-06 10:10:46.170 228580 WARNING nova.virt.libvirt.driver [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:10:46 compute-1 nova_compute[228576]: 2025-12-06 10:10:46.171 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5207MB free_disk=59.94276428222656GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:10:46 compute-1 nova_compute[228576]: 2025-12-06 10:10:46.172 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:10:46 compute-1 nova_compute[228576]: 2025-12-06 10:10:46.172 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:10:46 compute-1 nova_compute[228576]: 2025-12-06 10:10:46.328 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:10:46 compute-1 nova_compute[228576]: 2025-12-06 10:10:46.329 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:10:46 compute-1 nova_compute[228576]: 2025-12-06 10:10:46.405 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Refreshing inventories for resource provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec 06 10:10:46 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:46 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004190 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:46 compute-1 nova_compute[228576]: 2025-12-06 10:10:46.467 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Updating ProviderTree inventory for provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec 06 10:10:46 compute-1 nova_compute[228576]: 2025-12-06 10:10:46.468 228580 DEBUG nova.compute.provider_tree [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Updating inventory in ProviderTree for provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 06 10:10:46 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:46 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:46 compute-1 nova_compute[228576]: 2025-12-06 10:10:46.483 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Refreshing aggregate associations for resource provider ff2f17cb-ff1d-4da7-9560-4be741380cb1, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec 06 10:10:46 compute-1 nova_compute[228576]: 2025-12-06 10:10:46.513 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Refreshing trait associations for resource provider ff2f17cb-ff1d-4da7-9560-4be741380cb1, traits: COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_ACCELERATORS,HW_CPU_X86_SVM,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_AESNI,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSSE3,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE4A,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_FMA3,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_AVX,HW_CPU_X86_BMI2,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SHA,HW_CPU_X86_SSE2,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_F16C,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_ABM,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NODE,COMPUTE_SECURITY_TPM_2_0,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_TPM_1_2,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_CLMUL,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AVX2,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_BMI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SOCKET_PCI_NUMA_AFFINITY _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec 06 10:10:46 compute-1 nova_compute[228576]: 2025-12-06 10:10:46.531 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:10:46 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:10:46 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:10:46 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:46.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:10:46 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:10:46 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:10:46 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:46.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:10:46 compute-1 ceph-mon[79770]: pgmap v888: 337 pgs: 337 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 326 KiB/s rd, 2.1 MiB/s wr, 64 op/s
Dec 06 10:10:47 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:10:47 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2493443694' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:10:47 compute-1 nova_compute[228576]: 2025-12-06 10:10:47.040 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.508s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:10:47 compute-1 nova_compute[228576]: 2025-12-06 10:10:47.045 228580 DEBUG nova.compute.provider_tree [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed in ProviderTree for provider: ff2f17cb-ff1d-4da7-9560-4be741380cb1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:10:47 compute-1 nova_compute[228576]: 2025-12-06 10:10:47.192 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed for provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:10:47 compute-1 nova_compute[228576]: 2025-12-06 10:10:47.194 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:10:47 compute-1 nova_compute[228576]: 2025-12-06 10:10:47.194 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.022s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:10:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:47 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:47 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/2163291013' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:10:47 compute-1 ceph-mon[79770]: pgmap v889: 337 pgs: 337 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 137 KiB/s rd, 106 KiB/s wr, 31 op/s
Dec 06 10:10:47 compute-1 ceph-mon[79770]: from='client.? 192.168.122.10:0/755035782' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 06 10:10:47 compute-1 ceph-mon[79770]: from='client.? 192.168.122.10:0/755035782' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 06 10:10:47 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/3587187352' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:10:47 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/2493443694' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:10:47 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/2299801742' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:10:48 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:48 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224003090 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:48 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:48 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004190 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:48 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:10:48 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:10:48 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:48.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:10:48 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:10:48 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:10:48 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:48.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:10:48 compute-1 podman[234774]: 2025-12-06 10:10:48.745564925 +0000 UTC m=+0.053337977 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Dec 06 10:10:48 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:10:49 compute-1 nova_compute[228576]: 2025-12-06 10:10:49.187 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:10:49 compute-1 nova_compute[228576]: 2025-12-06 10:10:49.188 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:10:49 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:49 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:49 compute-1 ceph-mon[79770]: pgmap v890: 337 pgs: 337 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 138 KiB/s rd, 111 KiB/s wr, 31 op/s
Dec 06 10:10:50 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:50 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:50 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:50 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224003090 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:50 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:10:50 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:10:50 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:50.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:10:50 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:10:50 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:10:50 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:50.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:10:51 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:51 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224003090 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:51 compute-1 ceph-mon[79770]: pgmap v891: 337 pgs: 337 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 938 B/s rd, 16 KiB/s wr, 1 op/s
Dec 06 10:10:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:52 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_42] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:52 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:52 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:10:52 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:10:52 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:52.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:10:52 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:10:52 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:10:52 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:52.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:10:52 compute-1 sudo[234795]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:10:52 compute-1 sudo[234795]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:10:52 compute-1 sudo[234795]: pam_unix(sudo:session): session closed for user root
Dec 06 10:10:52 compute-1 podman[234819]: 2025-12-06 10:10:52.877695526 +0000 UTC m=+0.065880407 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 06 10:10:53 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:53 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:53 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:10:53 compute-1 ceph-mon[79770]: pgmap v892: 337 pgs: 337 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 938 B/s rd, 16 KiB/s wr, 1 op/s
Dec 06 10:10:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:10:54.284 141446 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:10:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:10:54.285 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:10:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:10:54.285 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:10:54 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:54 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:54 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:10:54 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:10:54 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:54.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:10:54 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:10:54 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:10:54 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:54.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:10:54 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:10:54 compute-1 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #52. Immutable memtables: 0.
Dec 06 10:10:54 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:10:54.987366) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 06 10:10:54 compute-1 ceph-mon[79770]: rocksdb: [db/flush_job.cc:856] [default] [JOB 29] Flushing memtable with next log file: 52
Dec 06 10:10:54 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015854987601, "job": 29, "event": "flush_started", "num_memtables": 1, "num_entries": 2371, "num_deletes": 251, "total_data_size": 6486955, "memory_usage": 6571544, "flush_reason": "Manual Compaction"}
Dec 06 10:10:54 compute-1 ceph-mon[79770]: rocksdb: [db/flush_job.cc:885] [default] [JOB 29] Level-0 flush table #53: started
Dec 06 10:10:55 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015855016882, "cf_name": "default", "job": 29, "event": "table_file_creation", "file_number": 53, "file_size": 4184524, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 25976, "largest_seqno": 28342, "table_properties": {"data_size": 4174854, "index_size": 6100, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2501, "raw_key_size": 20016, "raw_average_key_size": 20, "raw_value_size": 4155559, "raw_average_value_size": 4236, "num_data_blocks": 267, "num_entries": 981, "num_filter_entries": 981, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015644, "oldest_key_time": 1765015644, "file_creation_time": 1765015854, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 53, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:10:55 compute-1 ceph-mon[79770]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 29] Flush lasted 29530 microseconds, and 12427 cpu microseconds.
Dec 06 10:10:55 compute-1 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:10:55 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:10:55.016949) [db/flush_job.cc:967] [default] [JOB 29] Level-0 flush table #53: 4184524 bytes OK
Dec 06 10:10:55 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:10:55.016975) [db/memtable_list.cc:519] [default] Level-0 commit table #53 started
Dec 06 10:10:55 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:10:55.018898) [db/memtable_list.cc:722] [default] Level-0 commit table #53: memtable #1 done
Dec 06 10:10:55 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:10:55.018915) EVENT_LOG_v1 {"time_micros": 1765015855018909, "job": 29, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 06 10:10:55 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:10:55.018934) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 06 10:10:55 compute-1 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 29] Try to delete WAL files size 6476383, prev total WAL file size 6476383, number of live WAL files 2.
Dec 06 10:10:55 compute-1 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000049.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:10:55 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:10:55.020528) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032303038' seq:72057594037927935, type:22 .. '7061786F730032323630' seq:0, type:0; will stop at (end)
Dec 06 10:10:55 compute-1 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 30] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 06 10:10:55 compute-1 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 29 Base level 0, inputs: [53(4086KB)], [51(12MB)]
Dec 06 10:10:55 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015855020668, "job": 30, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [53], "files_L6": [51], "score": -1, "input_data_size": 17518729, "oldest_snapshot_seqno": -1}
Dec 06 10:10:55 compute-1 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 30] Generated table #54: 5894 keys, 15448376 bytes, temperature: kUnknown
Dec 06 10:10:55 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015855125556, "cf_name": "default", "job": 30, "event": "table_file_creation", "file_number": 54, "file_size": 15448376, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15407601, "index_size": 24921, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 14789, "raw_key_size": 149731, "raw_average_key_size": 25, "raw_value_size": 15299586, "raw_average_value_size": 2595, "num_data_blocks": 1018, "num_entries": 5894, "num_filter_entries": 5894, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014012, "oldest_key_time": 0, "file_creation_time": 1765015855, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 54, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:10:55 compute-1 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:10:55 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:10:55.126197) [db/compaction/compaction_job.cc:1663] [default] [JOB 30] Compacted 1@0 + 1@6 files to L6 => 15448376 bytes
Dec 06 10:10:55 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:10:55.128561) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 166.8 rd, 147.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(4.0, 12.7 +0.0 blob) out(14.7 +0.0 blob), read-write-amplify(7.9) write-amplify(3.7) OK, records in: 6414, records dropped: 520 output_compression: NoCompression
Dec 06 10:10:55 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:10:55.128599) EVENT_LOG_v1 {"time_micros": 1765015855128582, "job": 30, "event": "compaction_finished", "compaction_time_micros": 105011, "compaction_time_cpu_micros": 35598, "output_level": 6, "num_output_files": 1, "total_output_size": 15448376, "num_input_records": 6414, "num_output_records": 5894, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 06 10:10:55 compute-1 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000053.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:10:55 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015855130358, "job": 30, "event": "table_file_deletion", "file_number": 53}
Dec 06 10:10:55 compute-1 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000051.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:10:55 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015855135233, "job": 30, "event": "table_file_deletion", "file_number": 51}
Dec 06 10:10:55 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:10:55.020397) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:10:55 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:10:55.135403) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:10:55 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:10:55.135414) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:10:55 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:10:55.135416) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:10:55 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:10:55.135418) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:10:55 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:10:55.135419) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:10:55 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:55 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c003370 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:55 compute-1 ceph-mon[79770]: pgmap v893: 337 pgs: 337 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 3.3 KiB/s rd, 20 KiB/s wr, 1 op/s
Dec 06 10:10:56 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:56 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_47] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:56 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:56 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:56 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:10:56 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:10:56 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:56.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:10:56 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:10:56 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:10:56 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:56.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:10:57 compute-1 ceph-mon[79770]: pgmap v894: 337 pgs: 337 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 2.6 KiB/s rd, 8.3 KiB/s wr, 1 op/s
Dec 06 10:10:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:57 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:58 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:58 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c003370 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:58 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:58 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_47] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:10:58 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:10:58 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:10:58 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:10:58.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:10:58 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:10:58 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.002000049s ======
Dec 06 10:10:58 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:10:58.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000049s
Dec 06 10:10:58 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:10:59 compute-1 ceph-mon[79770]: pgmap v895: 337 pgs: 337 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 2.7 KiB/s rd, 8.7 KiB/s wr, 1 op/s
Dec 06 10:10:59 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:10:59 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:00 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:00 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224005260 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:00 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:00 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c003370 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:00 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:11:00 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:11:00 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:00.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:11:00 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:11:00 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:11:00 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:00.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:11:00 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:11:00 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/938827649' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:11:01 compute-1 ceph-mon[79770]: pgmap v896: 337 pgs: 337 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 2.6 KiB/s rd, 4.0 KiB/s wr, 1 op/s
Dec 06 10:11:01 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/938827649' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:11:01 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:01 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_47] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:02 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_47] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:02 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:02 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:11:02 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:11:02 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:02.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:11:02 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:11:02 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:11:02 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:02.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:11:03 compute-1 ceph-mon[79770]: pgmap v897: 337 pgs: 337 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 2.6 KiB/s rd, 4.0 KiB/s wr, 1 op/s
Dec 06 10:11:03 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:03 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224005260 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:03 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:11:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:04 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c003390 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:04 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c003390 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:04 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:11:04 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:11:04 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:04.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:11:04 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:11:04 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:11:04 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:04.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:11:05 compute-1 ceph-mon[79770]: pgmap v898: 337 pgs: 337 active+clean; 167 MiB data, 328 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 1.8 MiB/s wr, 28 op/s
Dec 06 10:11:05 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/737373580' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec 06 10:11:05 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:05 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:06 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/301725092' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec 06 10:11:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:06 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224005260 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:06 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c003390 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:06 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:11:06 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:11:06 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:06.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:11:06 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:11:06 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:11:06 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:06.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:11:07 compute-1 ceph-mon[79770]: pgmap v899: 337 pgs: 337 active+clean; 167 MiB data, 328 MiB used, 60 GiB / 60 GiB avail; 17 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Dec 06 10:11:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:07 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_47] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:08 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:08 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:08 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:08 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224005280 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:08 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:11:08 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:11:08 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:08.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:11:08 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:11:08 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:11:08 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:08.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:11:08 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:11:09 compute-1 ceph-mon[79770]: pgmap v900: 337 pgs: 337 active+clean; 167 MiB data, 328 MiB used, 60 GiB / 60 GiB avail; 25 KiB/s rd, 1.8 MiB/s wr, 38 op/s
Dec 06 10:11:09 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:11:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:09 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c004910 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:10 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:10 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_47] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:10 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:10 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:10 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:11:10 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:11:10 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:10.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:11:10 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:11:10 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:11:10 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:10.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:11:11 compute-1 ceph-mon[79770]: pgmap v901: 337 pgs: 337 active+clean; 167 MiB data, 328 MiB used, 60 GiB / 60 GiB avail; 25 KiB/s rd, 1.8 MiB/s wr, 38 op/s
Dec 06 10:11:11 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:11 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22240052a0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:12 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c004910 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:12 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_47] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:12 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:11:12 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:11:12 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:12.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:11:12 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:11:12 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:11:12 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:12.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:11:12 compute-1 sudo[234853]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:11:12 compute-1 sudo[234853]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:11:12 compute-1 sudo[234853]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:13 compute-1 ceph-mon[79770]: pgmap v902: 337 pgs: 337 active+clean; 167 MiB data, 328 MiB used, 60 GiB / 60 GiB avail; 25 KiB/s rd, 1.8 MiB/s wr, 38 op/s
Dec 06 10:11:13 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:13 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:13 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:11:14 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:14 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22240052c0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:14 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:14 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c004910 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:14 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:11:14 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:11:14 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:14.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:11:14 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:11:14 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:11:14 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:14.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:11:15 compute-1 ceph-mon[79770]: pgmap v903: 337 pgs: 337 active+clean; 167 MiB data, 328 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 102 op/s
Dec 06 10:11:15 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:15 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_47] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:15 compute-1 podman[234879]: 2025-12-06 10:11:15.795649688 +0000 UTC m=+0.095409516 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true)
Dec 06 10:11:16 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:16 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:16 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:16 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22240052e0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:16 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:11:16 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:11:16 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:16.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:11:16 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:11:16 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:11:16 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:16.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:11:17 compute-1 ceph-mon[79770]: pgmap v904: 337 pgs: 337 active+clean; 167 MiB data, 328 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 16 KiB/s wr, 75 op/s
Dec 06 10:11:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:17 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c004910 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:18 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_47] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:18 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:18 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:11:18 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:11:18 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:18.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:11:18 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:11:18 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:11:18 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:18.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:11:18 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:11:19 compute-1 ceph-mon[79770]: pgmap v905: 337 pgs: 337 active+clean; 167 MiB data, 328 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 19 KiB/s wr, 75 op/s
Dec 06 10:11:19 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:19 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224005300 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:19 compute-1 podman[234908]: 2025-12-06 10:11:19.738435396 +0000 UTC m=+0.050072637 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent)
Dec 06 10:11:20 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:20 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_47] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224005300 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:20 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:20 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_47] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:20 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:11:20 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:11:20 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:20.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:11:20 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:11:20 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:11:20 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:20.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:11:21 compute-1 ceph-mon[79770]: pgmap v906: 337 pgs: 337 active+clean; 167 MiB data, 328 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 3.3 KiB/s wr, 64 op/s
Dec 06 10:11:21 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:21 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:22 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224005300 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:22 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_43] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c004910 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:22 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:11:22 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:11:22 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:22.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:11:22 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:11:22 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:11:22 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:22.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:11:23 compute-1 ceph-mon[79770]: pgmap v907: 337 pgs: 337 active+clean; 167 MiB data, 328 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 3.3 KiB/s wr, 64 op/s
Dec 06 10:11:23 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:23 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_47] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:23 compute-1 podman[234929]: 2025-12-06 10:11:23.764809697 +0000 UTC m=+0.064159704 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd)
Dec 06 10:11:23 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:11:24 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:24 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:24 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:11:24 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:24 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224005320 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:24 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:11:24 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:11:24 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:24.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:11:24 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:11:24 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:11:24 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:24.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:11:25 compute-1 ceph-mon[79770]: pgmap v908: 337 pgs: 337 active+clean; 200 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 2.2 MiB/s rd, 2.1 MiB/s wr, 128 op/s
Dec 06 10:11:25 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:25 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_47] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2224005320 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:26 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:26 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_48] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:26 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:26 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218002df0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:26 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:11:26 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:11:26 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:26.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:11:26 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:11:26 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:11:26 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:26.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:11:27 compute-1 nova_compute[228576]: 2025-12-06 10:11:27.481 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:11:27 compute-1 ceph-mon[79770]: pgmap v909: 337 pgs: 337 active+clean; 200 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 305 KiB/s rd, 2.1 MiB/s wr, 64 op/s
Dec 06 10:11:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:27 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:28 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:28 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210002830 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:28 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:28 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_48] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:28 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:11:28 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:11:28 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:28.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:11:28 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:11:28 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:11:28 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:28.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:11:28 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:11:29 compute-1 ceph-mon[79770]: pgmap v910: 337 pgs: 337 active+clean; 200 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 305 KiB/s rd, 2.2 MiB/s wr, 64 op/s
Dec 06 10:11:29 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:29 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218002df0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:30 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:30 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:30 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:30 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210002830 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:30 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:11:30 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:11:30 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:30.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:11:30 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:11:30 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:11:30 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:30.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:11:31 compute-1 ceph-mon[79770]: pgmap v911: 337 pgs: 337 active+clean; 200 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 305 KiB/s rd, 2.2 MiB/s wr, 64 op/s
Dec 06 10:11:31 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:31 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_48] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:31 compute-1 sudo[234956]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:11:31 compute-1 sudo[234956]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:11:31 compute-1 sudo[234956]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:31 compute-1 sudo[234981]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 06 10:11:31 compute-1 sudo[234981]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:11:32 compute-1 sudo[234981]: pam_unix(sudo:session): session closed for user root
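The sudo records above trace cephadm's management loop: ceph-admin escalates to locate python3, then runs the digest-named cephadm copy under /var/lib/ceph/<fsid>/ with gather-facts so the mgr (mgr.compute-0 at 192.168.122.100) can refresh its host inventory. gather-facts prints one JSON document; a hedged sketch reading two fields, invoking the packaged cephadm entry point rather than the digest-named copy, with the key names assumed from typical cephadm output:

    # Sketch: run cephadm's host-inventory collection and pick out two fields.
    # `cephadm gather-facts` prints a single JSON object; the exact key set
    # varies by release, so .get() is used and the names are assumptions.
    import json
    import subprocess

    facts = json.loads(
        subprocess.run(["cephadm", "gather-facts"],
                       check=True, capture_output=True, text=True).stdout
    )
    print(facts.get("hostname"), facts.get("memory_total_kb"))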
Dec 06 10:11:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:32 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218002df0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:32 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:32 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:11:32 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:11:32 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:32.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:11:32 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:11:32 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:11:32 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:32.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:11:32 compute-1 sudo[235037]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:11:32 compute-1 sudo[235037]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:11:32 compute-1 sudo[235037]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:33 compute-1 ceph-mon[79770]: pgmap v912: 337 pgs: 337 active+clean; 200 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 305 KiB/s rd, 2.2 MiB/s wr, 64 op/s
Dec 06 10:11:33 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:33 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210002830 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:33 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:11:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:34 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_48] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:34 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:34 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:11:34 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:11:34 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:34.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:11:34 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:11:34 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:11:34 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:34.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:11:35 compute-1 ceph-mon[79770]: pgmap v913: 337 pgs: 337 active+clean; 182 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 314 KiB/s rd, 2.2 MiB/s wr, 76 op/s
Dec 06 10:11:35 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:11:35 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:11:35 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 10:11:35 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 06 10:11:35 compute-1 ceph-mon[79770]: pgmap v914: 337 pgs: 337 active+clean; 182 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 10 KiB/s rd, 19 KiB/s wr, 14 op/s
Dec 06 10:11:35 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:11:35 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:11:35 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 06 10:11:35 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 06 10:11:35 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 10:11:35 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:35 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218002df0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:36 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210002830 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:36 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_48] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:36 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/3176469227' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:11:36 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:11:36 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:11:36 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:36.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:11:36 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:11:36 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:11:36 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:36.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:11:36 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:11:36.847 141446 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '9a:dc:0d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'b6:0a:c4:b8:be:39'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:11:36 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:11:36.848 141446 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 06 10:11:37 compute-1 ceph-mon[79770]: pgmap v915: 337 pgs: 337 active+clean; 182 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 10 KiB/s rd, 19 KiB/s wr, 14 op/s
Dec 06 10:11:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:37 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:37 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:11:37.851 141446 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=61eba479-a995-4b31-88b9-8ebfcea9907e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
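This exchange is the metadata agent acknowledging a southbound config bump: SB_Global.nb_cfg moved from 8 to 9, the agent waited the one-second delay it logged, then wrote neutron:ovn-metadata-sb-cfg = '9' into its own Chassis_Private row so neutron can see the chassis is alive and in sync. A sketch reading that marker back from the shell, assuming ovn-sbctl is installed on the host and reusing the record UUID from the DbSetCommand above:

    # Sketch: read back the sync marker the agent just wrote.
    # UUID copied from the transaction logged above; ovn-sbctl assumed on PATH.
    import subprocess

    out = subprocess.run(
        ["ovn-sbctl", "get", "Chassis_Private",
         "61eba479-a995-4b31-88b9-8ebfcea9907e", "external_ids"],
        check=True, capture_output=True, text=True,
    ).stdout
    print(out.strip())  # e.g. {neutron:ovn-metadata-sb-cfg="9", ...}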
Dec 06 10:11:38 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:38 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218002df0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:38 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:38 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22100035d0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:38 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:11:38 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:11:38 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:38.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:11:38 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:11:38 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:11:38 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:38.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:11:38 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:11:39 compute-1 sudo[235065]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:11:39 compute-1 sudo[235065]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:11:39 compute-1 sudo[235065]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:39 compute-1 nova_compute[228576]: 2025-12-06 10:11:39.469 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:11:39 compute-1 nova_compute[228576]: 2025-12-06 10:11:39.470 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:11:39 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:39 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_48] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:39 compute-1 ceph-mon[79770]: pgmap v916: 337 pgs: 337 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 22 KiB/s rd, 4.9 KiB/s wr, 33 op/s
Dec 06 10:11:39 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:11:39 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:11:39 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:11:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:40 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:40 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218002df0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:40 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:11:40 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:11:40 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:40.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:11:40 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:11:40 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:11:40 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:40.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:11:41 compute-1 ceph-mon[79770]: pgmap v917: 337 pgs: 337 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 22 KiB/s rd, 4.9 KiB/s wr, 33 op/s
Dec 06 10:11:41 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:41 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22100035d0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:42 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_48] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:42 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:42 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:11:42 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:11:42 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:42.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:11:42 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:11:42 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:11:42 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:42.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:11:43 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:43 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:43 compute-1 ceph-mon[79770]: pgmap v918: 337 pgs: 337 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 22 KiB/s rd, 4.9 KiB/s wr, 33 op/s
Dec 06 10:11:43 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/1436422311' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:11:43 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:11:44 compute-1 nova_compute[228576]: 2025-12-06 10:11:44.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:11:44 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:44 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22100035d0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:44 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:44 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_48] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:44 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:11:44 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:11:44 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:44.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:11:44 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/3230595324' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:11:44 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:11:44 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:11:44 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:44.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:11:45 compute-1 nova_compute[228576]: 2025-12-06 10:11:45.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:11:45 compute-1 nova_compute[228576]: 2025-12-06 10:11:45.471 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:11:45 compute-1 nova_compute[228576]: 2025-12-06 10:11:45.471 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:11:45 compute-1 nova_compute[228576]: 2025-12-06 10:11:45.489 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 06 10:11:45 compute-1 nova_compute[228576]: 2025-12-06 10:11:45.490 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:11:45 compute-1 nova_compute[228576]: 2025-12-06 10:11:45.490 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:11:45 compute-1 nova_compute[228576]: 2025-12-06 10:11:45.490 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:11:45 compute-1 nova_compute[228576]: 2025-12-06 10:11:45.524 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:11:45 compute-1 nova_compute[228576]: 2025-12-06 10:11:45.525 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:11:45 compute-1 nova_compute[228576]: 2025-12-06 10:11:45.525 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:11:45 compute-1 nova_compute[228576]: 2025-12-06 10:11:45.525 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:11:45 compute-1 nova_compute[228576]: 2025-12-06 10:11:45.525 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:11:45 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:45 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:45 compute-1 ceph-mon[79770]: pgmap v919: 337 pgs: 337 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 12 KiB/s rd, 3.0 KiB/s wr, 19 op/s
Dec 06 10:11:45 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:11:45 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/343641707' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:11:45 compute-1 nova_compute[228576]: 2025-12-06 10:11:45.977 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
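As the two lines above show, the resource tracker sizes its RBD-backed disk by shelling out to `ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf` (0.452 s here); the corresponding mon-side dispatch of {"prefix": "df", "format": "json"} is visible in the ceph-mon audit lines. A sketch of the same call reduced to cluster free space, assuming the conventional `ceph df --format=json` layout with a top-level "stats" object of byte totals:

    # Sketch: the query nova just ran, reduced to cluster free space.
    # Assumes the usual `ceph df --format=json` layout ("stats" with
    # total_bytes / total_avail_bytes); command copied from the log.
    import json
    import subprocess

    df = json.loads(subprocess.run(
        ["ceph", "df", "--format=json", "--id", "openstack",
         "--conf", "/etc/ceph/ceph.conf"],
        check=True, capture_output=True, text=True).stdout)
    stats = df.get("stats", {})
    print(stats.get("total_avail_bytes"), "bytes free of",
          stats.get("total_bytes"))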
Dec 06 10:11:46 compute-1 nova_compute[228576]: 2025-12-06 10:11:46.205 228580 WARNING nova.virt.libvirt.driver [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:11:46 compute-1 nova_compute[228576]: 2025-12-06 10:11:46.206 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5209MB free_disk=59.942543029785156GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:11:46 compute-1 nova_compute[228576]: 2025-12-06 10:11:46.207 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:11:46 compute-1 nova_compute[228576]: 2025-12-06 10:11:46.207 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:11:46 compute-1 nova_compute[228576]: 2025-12-06 10:11:46.291 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:11:46 compute-1 nova_compute[228576]: 2025-12-06 10:11:46.291 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:11:46 compute-1 nova_compute[228576]: 2025-12-06 10:11:46.308 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:11:46 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:46 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:46 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:46 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22100035d0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:46 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:11:46 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:11:46 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:46.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:11:46 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:11:46 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1506613304' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:11:46 compute-1 nova_compute[228576]: 2025-12-06 10:11:46.750 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:11:46 compute-1 nova_compute[228576]: 2025-12-06 10:11:46.758 228580 DEBUG nova.compute.provider_tree [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed in ProviderTree for provider: ff2f17cb-ff1d-4da7-9560-4be741380cb1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:11:46 compute-1 nova_compute[228576]: 2025-12-06 10:11:46.773 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed for provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
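The inventory dict above is what placement actually schedules against: usable capacity per resource class is (total - reserved) * allocation_ratio, so this host advertises (8 - 0) * 4.0 = 32 VCPU, (7679 - 512) * 1.0 = 7167 MB of RAM, and (59 - 0) * 0.9 = 53.1 GB of disk. Worked directly from the logged data:

    # Sketch: effective capacity derived from the inventory logged above,
    # using capacity = (total - reserved) * allocation_ratio.
    inventory = {
        "VCPU": {"total": 8, "reserved": 0, "allocation_ratio": 4.0},
        "MEMORY_MB": {"total": 7679, "reserved": 512, "allocation_ratio": 1.0},
        "DISK_GB": {"total": 59, "reserved": 0, "allocation_ratio": 0.9},
    }
    for rc, inv in inventory.items():
        cap = (inv["total"] - inv["reserved"]) * inv["allocation_ratio"]
        print(f"{rc}: {cap}")
    # VCPU: 32.0, MEMORY_MB: 7167.0, DISK_GB: 53.1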
Dec 06 10:11:46 compute-1 nova_compute[228576]: 2025-12-06 10:11:46.777 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:11:46 compute-1 nova_compute[228576]: 2025-12-06 10:11:46.777 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.570s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:11:46 compute-1 podman[235136]: 2025-12-06 10:11:46.800987616 +0000 UTC m=+0.100362658 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller)
Dec 06 10:11:46 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:11:46 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:11:46 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:46.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:11:46 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/343641707' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:11:46 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/2501168822' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:11:46 compute-1 ceph-mon[79770]: from='client.? 192.168.122.10:0/3014585911' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 06 10:11:46 compute-1 ceph-mon[79770]: from='client.? 192.168.122.10:0/3014585911' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 06 10:11:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:47 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_48] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:47 compute-1 ceph-mon[79770]: pgmap v920: 337 pgs: 337 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 11 KiB/s rd, 2.6 KiB/s wr, 16 op/s
Dec 06 10:11:47 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/1506613304' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:11:48 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:48 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218002df0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:48 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:48 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:48 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:11:48 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:11:48 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:48.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:11:48 compute-1 nova_compute[228576]: 2025-12-06 10:11:48.758 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:11:48 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:11:48 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:11:48 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:48.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:11:48 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:11:49 compute-1 nova_compute[228576]: 2025-12-06 10:11:49.464 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:11:49 compute-1 nova_compute[228576]: 2025-12-06 10:11:49.469 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:11:49 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:49 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22100035d0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:49 compute-1 ceph-mon[79770]: pgmap v921: 337 pgs: 337 active+clean; 41 MiB data, 261 MiB used, 60 GiB / 60 GiB avail; 30 KiB/s rd, 3.7 KiB/s wr, 44 op/s
Dec 06 10:11:49 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/468755846' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:11:49 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/1105957875' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:11:50 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:50 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_48] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:50 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:50 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218002df0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:50 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:11:50 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:11:50 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:50.669 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:11:50 compute-1 podman[235166]: 2025-12-06 10:11:50.742301127 +0000 UTC m=+0.048812355 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:11:50 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:11:50 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:11:50 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:50.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:11:51 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:51 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22280051b0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:51 compute-1 ceph-mon[79770]: pgmap v922: 337 pgs: 337 active+clean; 41 MiB data, 261 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 27 op/s
Dec 06 10:11:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:52 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22100035d0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:52 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22100035d0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:52 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:11:52 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:11:52 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:52.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:11:52 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:11:52 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:11:52 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:52.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:11:53 compute-1 sudo[235186]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:11:53 compute-1 sudo[235186]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:11:53 compute-1 sudo[235186]: pam_unix(sudo:session): session closed for user root
Dec 06 10:11:53 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:53 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_49] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22100035d0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:53 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:11:53 compute-1 ceph-mon[79770]: pgmap v923: 337 pgs: 337 active+clean; 41 MiB data, 261 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 27 op/s
Dec 06 10:11:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:11:54.286 141446 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:11:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:11:54.286 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:11:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:11:54.286 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:11:54 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_48] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218002df0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:54 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:54 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:11:54 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.002000050s ======
Dec 06 10:11:54 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:54.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000050s
Dec 06 10:11:54 compute-1 podman[235212]: 2025-12-06 10:11:54.760140388 +0000 UTC m=+0.058979566 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 06 10:11:54 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:11:54 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:11:54 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:54.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:11:54 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:11:55 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:55 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004190 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:56 compute-1 ceph-mon[79770]: pgmap v924: 337 pgs: 337 active+clean; 41 MiB data, 261 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Dec 06 10:11:56 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:56 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_49] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22100035d0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:56 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:56 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_48] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218002df0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:56 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:11:56 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:11:56 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:56.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:11:56 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:11:56 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:11:56 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:56.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:11:57 compute-1 ceph-mon[79770]: pgmap v925: 337 pgs: 337 active+clean; 41 MiB data, 261 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 27 op/s
Dec 06 10:11:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:57 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:58 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:58 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004190 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:58 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:58 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_49] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22240031c0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:11:58 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:11:58 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:11:58 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:11:58.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:11:58 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:11:58 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:11:58 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:11:58.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:11:58 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:11:59 compute-1 ceph-mon[79770]: pgmap v926: 337 pgs: 337 active+clean; 41 MiB data, 261 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Dec 06 10:11:59 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:11:59 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_51] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218002df0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:00 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:00 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:00 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:00 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004190 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:00 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:12:00 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:12:00 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:00.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:12:00 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:12:00 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:12:00 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:00.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:12:01 compute-1 ceph-mon[79770]: pgmap v927: 337 pgs: 337 active+clean; 41 MiB data, 261 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:12:01 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:01 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_51] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004190 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:02 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_51] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218002df0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:02 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:02 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:12:02 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:12:02 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:02.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:12:02 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:12:02 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:12:02 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:02.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:12:03 compute-1 ceph-mon[79770]: pgmap v928: 337 pgs: 337 active+clean; 41 MiB data, 261 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:12:03 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:03 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004190 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:03 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:12:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:04 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_51] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004190 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:04 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_49] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218002df0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:04 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:12:04 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:12:04 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:04.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:12:04 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:12:04 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:12:04 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:04.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:12:05 compute-1 ceph-mon[79770]: pgmap v929: 337 pgs: 337 active+clean; 41 MiB data, 260 MiB used, 60 GiB / 60 GiB avail; 94 KiB/s rd, 0 B/s wr, 156 op/s
Dec 06 10:12:05 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:05 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:06 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22240031e0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:06 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_51] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004190 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:06 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:12:06 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:12:06 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:06.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:12:06 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:12:06 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:12:06 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:06.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:12:07 compute-1 ceph-mon[79770]: pgmap v930: 337 pgs: 337 active+clean; 41 MiB data, 260 MiB used, 60 GiB / 60 GiB avail; 94 KiB/s rd, 0 B/s wr, 156 op/s
Dec 06 10:12:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/101207 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec 06 10:12:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:07 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_49] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218002df0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:08 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:08 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:08 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:08 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22240031e0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:08 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/2628881591' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:12:08 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:12:08 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:12:08 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:08.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:12:08 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:12:08 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:12:08 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:08.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:12:08 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:12:09 compute-1 ceph-mon[79770]: pgmap v931: 337 pgs: 337 active+clean; 41 MiB data, 260 MiB used, 60 GiB / 60 GiB avail; 94 KiB/s rd, 0 B/s wr, 156 op/s
Dec 06 10:12:09 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:12:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:09 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_51] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004190 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:10 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:10 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_49] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:10 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:10 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218002df0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:10 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:12:10 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:12:10 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:10.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:12:10 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:12:10 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:12:10 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:10.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:12:11 compute-1 ceph-mon[79770]: pgmap v932: 337 pgs: 337 active+clean; 41 MiB data, 260 MiB used, 60 GiB / 60 GiB avail; 94 KiB/s rd, 0 B/s wr, 156 op/s
Dec 06 10:12:11 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:11 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22240031e0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:12 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_51] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004190 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:12 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_49] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:12 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:12:12 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:12:12 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:12.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:12:12 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:12:12 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:12:12 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:12.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:12:13 compute-1 sudo[235241]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:12:13 compute-1 sudo[235241]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:12:13 compute-1 sudo[235241]: pam_unix(sudo:session): session closed for user root
Dec 06 10:12:13 compute-1 ceph-mon[79770]: pgmap v933: 337 pgs: 337 active+clean; 41 MiB data, 260 MiB used, 60 GiB / 60 GiB avail; 94 KiB/s rd, 0 B/s wr, 156 op/s
Dec 06 10:12:13 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:13 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218002df0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:13 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:12:14 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:14 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22240031e0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:14 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:14 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_51] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004190 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:14 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/1421545061' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec 06 10:12:14 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/1184900673' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec 06 10:12:14 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:12:14 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:12:14 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:14.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:12:14 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:12:14 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:12:14 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:14.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:12:15 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:15 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_49] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:15 compute-1 ceph-mon[79770]: pgmap v934: 337 pgs: 337 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 111 KiB/s rd, 1.8 MiB/s wr, 183 op/s
Dec 06 10:12:16 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:16 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218002df0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:16 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:16 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22240031e0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:16 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:16 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:12:16 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:12:16 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:12:16 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:16.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:12:16 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:12:16 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:12:16 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:16.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:12:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:17 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_51] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240004190 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:17 compute-1 ceph-mon[79770]: pgmap v935: 337 pgs: 337 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 17 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Dec 06 10:12:17 compute-1 podman[235268]: 2025-12-06 10:12:17.809294697 +0000 UTC m=+0.107942315 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 06 10:12:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:18 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_49] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:18 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218002df0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:18 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:12:18 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:12:18 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:18.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:12:18 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:12:18 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:12:18 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:18.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:12:18 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:12:19 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:19 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:12:19 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:19 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:12:19 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:19 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22240031e0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:19 compute-1 ceph-mon[79770]: pgmap v936: 337 pgs: 337 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 102 op/s
Dec 06 10:12:20 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:20 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240006180 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:20 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:20 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_49] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:20 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:12:20 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:12:20 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:20.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:12:20 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:12:20 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:12:20 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:20.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:12:21 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:21 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_51] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:21 compute-1 podman[235296]: 2025-12-06 10:12:21.769248589 +0000 UTC m=+0.069701901 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125)
Dec 06 10:12:21 compute-1 ceph-mon[79770]: pgmap v937: 337 pgs: 337 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 102 op/s
Dec 06 10:12:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:22 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_49] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:22 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c002690 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:22 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec 06 10:12:22 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:12:22 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:12:22 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:22.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:12:22 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:12:22 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:12:22 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:22.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:12:23 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:23 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_52] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218002df0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:23 compute-1 ceph-mon[79770]: pgmap v938: 337 pgs: 337 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 102 op/s
Dec 06 10:12:23 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:12:24 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:24 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_49] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218002df0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:24 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:24 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:24 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:12:24 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:12:24 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:24.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:12:24 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:12:24 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:12:24 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:12:24 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:24.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:12:25 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:25 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c002690 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:25 compute-1 podman[235319]: 2025-12-06 10:12:25.81873187 +0000 UTC m=+0.067841275 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:12:25 compute-1 ceph-mon[79770]: pgmap v939: 337 pgs: 337 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 104 op/s
Dec 06 10:12:26 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:26 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_52] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240006180 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:26 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:26 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_49] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218002df0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:26 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:12:26 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:12:26 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:26.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:12:26 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:12:26 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:12:26 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:26.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:12:26 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/453190227' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:12:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:27 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/101227 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec 06 10:12:27 compute-1 ceph-mon[79770]: pgmap v940: 337 pgs: 337 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 13 KiB/s wr, 77 op/s
Dec 06 10:12:28 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:28 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c002690 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:28 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:28 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_52] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240006180 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:28 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:12:28 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:12:28 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:28.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:12:28 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:12:28 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:12:28 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:28.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:12:28 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:12:29 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:29 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_49] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218002df0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:29 compute-1 ceph-mon[79770]: pgmap v941: 337 pgs: 337 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 15 KiB/s wr, 103 op/s
Dec 06 10:12:30 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:30 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c003630 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:30 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:30 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c002690 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:30 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:12:30 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:12:30 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:30.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:12:30 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:12:30 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:12:30 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:30.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:12:31 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:31 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_52] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240006180 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:31 compute-1 ceph-mon[79770]: pgmap v942: 337 pgs: 337 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 1.6 KiB/s wr, 28 op/s
Dec 06 10:12:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:32 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_49] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218002df0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:32 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c005300 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:32 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:12:32 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:12:32 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:32.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:12:32 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:12:32 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:12:32 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:32.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:12:33 compute-1 sudo[235343]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:12:33 compute-1 sudo[235343]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:12:33 compute-1 sudo[235343]: pam_unix(sudo:session): session closed for user root
Dec 06 10:12:33 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:33 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c002690 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:33 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:12:33 compute-1 ceph-mon[79770]: pgmap v943: 337 pgs: 337 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 1.6 KiB/s wr, 28 op/s
Dec 06 10:12:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:34 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_52] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240006180 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:34 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_49] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218002df0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:34 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:12:34 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:12:34 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:34.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:12:34 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:12:34 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:12:34 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:34.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:12:35 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:35 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c005300 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:35 compute-1 ceph-mon[79770]: pgmap v944: 337 pgs: 337 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 1.6 KiB/s wr, 28 op/s
Dec 06 10:12:35 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/1575946514' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:12:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:36 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c002690 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:36 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_52] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240006180 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:36 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:12:36 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:12:36 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:36.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:12:36 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:12:36 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:12:36 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:36.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:12:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:37 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_49] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218002df0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:38 compute-1 ceph-mon[79770]: pgmap v945: 337 pgs: 337 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 26 op/s
Dec 06 10:12:38 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:38 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c005300 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:38 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:38 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c002690 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:38 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:12:38 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:12:38 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:38.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:12:38 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:12:38 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:12:38 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:38.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:12:38 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:12:39 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:12:39 compute-1 sudo[235371]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:12:39 compute-1 sudo[235371]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:12:39 compute-1 sudo[235371]: pam_unix(sudo:session): session closed for user root
Dec 06 10:12:39 compute-1 sudo[235396]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 06 10:12:39 compute-1 sudo[235396]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:12:39 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:39 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_52] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240006180 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:40 compute-1 ceph-mon[79770]: pgmap v946: 337 pgs: 337 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 36 KiB/s rd, 1.8 MiB/s wr, 53 op/s
Dec 06 10:12:40 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/3480538889' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec 06 10:12:40 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/3169413128' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec 06 10:12:40 compute-1 sudo[235396]: pam_unix(sudo:session): session closed for user root
Dec 06 10:12:40 compute-1 sshd-session[235439]: Invalid user  from 165.245.132.69 port 34834
Dec 06 10:12:40 compute-1 nova_compute[228576]: 2025-12-06 10:12:40.472 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:12:40 compute-1 nova_compute[228576]: 2025-12-06 10:12:40.474 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:12:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:40 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_49] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218002df0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:40 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c005300 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:40 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:12:40 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:12:40 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:40.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:12:40 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:12:40 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:12:40 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:40.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:12:41 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 10:12:41 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 06 10:12:41 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:12:41 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:12:41 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 06 10:12:41 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 06 10:12:41 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 10:12:41 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:41 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c002690 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:42 compute-1 ceph-mon[79770]: pgmap v947: 337 pgs: 337 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Dec 06 10:12:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:42 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_52] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240006180 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:42 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_49] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218002df0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:42 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:12:42 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:12:42 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:42.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:12:42 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:12:42 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:12:42 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:42.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:12:43 compute-1 ceph-mon[79770]: pgmap v948: 337 pgs: 337 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Dec 06 10:12:43 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:43 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c005300 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:43 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:12:44 compute-1 nova_compute[228576]: 2025-12-06 10:12:44.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:12:44 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:44 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c002690 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:44 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:44 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_52] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240006180 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:44 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:12:44 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:12:44 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:44.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:12:44 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:12:44.897 141446 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '9a:dc:0d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'b6:0a:c4:b8:be:39'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:12:44 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:12:44.898 141446 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 06 10:12:44 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:12:44 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:12:44 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:44.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:12:45 compute-1 ceph-mon[79770]: pgmap v949: 337 pgs: 337 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 2.0 MiB/s rd, 1.8 MiB/s wr, 103 op/s
Dec 06 10:12:45 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/3197267841' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:12:45 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/2219237189' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:12:45 compute-1 nova_compute[228576]: 2025-12-06 10:12:45.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:12:45 compute-1 nova_compute[228576]: 2025-12-06 10:12:45.494 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:12:45 compute-1 nova_compute[228576]: 2025-12-06 10:12:45.495 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:12:45 compute-1 nova_compute[228576]: 2025-12-06 10:12:45.495 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:12:45 compute-1 nova_compute[228576]: 2025-12-06 10:12:45.495 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:12:45 compute-1 nova_compute[228576]: 2025-12-06 10:12:45.495 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:12:45 compute-1 sudo[235476]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:12:45 compute-1 sudo[235476]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:12:45 compute-1 sudo[235476]: pam_unix(sudo:session): session closed for user root
Dec 06 10:12:45 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:45 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_49] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2218002df0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:45 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:12:45 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2886513219' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:12:45 compute-1 nova_compute[228576]: 2025-12-06 10:12:45.943 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.448s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:12:46 compute-1 nova_compute[228576]: 2025-12-06 10:12:46.113 228580 WARNING nova.virt.libvirt.driver [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:12:46 compute-1 nova_compute[228576]: 2025-12-06 10:12:46.115 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5214MB free_disk=59.96738052368164GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:12:46 compute-1 nova_compute[228576]: 2025-12-06 10:12:46.115 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:12:46 compute-1 nova_compute[228576]: 2025-12-06 10:12:46.115 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:12:46 compute-1 nova_compute[228576]: 2025-12-06 10:12:46.223 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:12:46 compute-1 nova_compute[228576]: 2025-12-06 10:12:46.223 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:12:46 compute-1 nova_compute[228576]: 2025-12-06 10:12:46.240 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:12:46 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:46 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c005300 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:46 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:12:46 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:12:46 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/2886513219' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:12:46 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/2784791036' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:12:46 compute-1 ceph-mon[79770]: from='client.? 192.168.122.10:0/3659217674' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 06 10:12:46 compute-1 ceph-mon[79770]: from='client.? 192.168.122.10:0/3659217674' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 06 10:12:46 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:46 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c002690 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:46 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:12:46 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1462413702' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:12:46 compute-1 nova_compute[228576]: 2025-12-06 10:12:46.682 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:12:46 compute-1 nova_compute[228576]: 2025-12-06 10:12:46.688 228580 DEBUG nova.compute.provider_tree [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed in ProviderTree for provider: ff2f17cb-ff1d-4da7-9560-4be741380cb1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:12:46 compute-1 nova_compute[228576]: 2025-12-06 10:12:46.701 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed for provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:12:46 compute-1 nova_compute[228576]: 2025-12-06 10:12:46.702 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:12:46 compute-1 nova_compute[228576]: 2025-12-06 10:12:46.703 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.587s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:12:46 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:12:46 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:12:46 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:46.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:12:46 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:12:46 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:12:46 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:46.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:12:47 compute-1 ceph-mon[79770]: pgmap v950: 337 pgs: 337 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 2.0 MiB/s rd, 1.8 MiB/s wr, 103 op/s
Dec 06 10:12:47 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/1462413702' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:12:47 compute-1 nova_compute[228576]: 2025-12-06 10:12:47.703 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:12:47 compute-1 nova_compute[228576]: 2025-12-06 10:12:47.730 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:12:47 compute-1 nova_compute[228576]: 2025-12-06 10:12:47.731 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:12:47 compute-1 nova_compute[228576]: 2025-12-06 10:12:47.731 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:12:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:47 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_52] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240006180 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:47 compute-1 nova_compute[228576]: 2025-12-06 10:12:47.748 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 06 10:12:47 compute-1 nova_compute[228576]: 2025-12-06 10:12:47.748 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:12:47 compute-1 nova_compute[228576]: 2025-12-06 10:12:47.749 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:12:47 compute-1 nova_compute[228576]: 2025-12-06 10:12:47.749 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:12:47 compute-1 sshd-session[235439]: Connection closed by invalid user  165.245.132.69 port 34834 [preauth]
Dec 06 10:12:48 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:48 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_49] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800cee0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:48 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:48 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c005300 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:48 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:12:48 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:12:48 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:48.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:12:48 compute-1 podman[235527]: 2025-12-06 10:12:48.78361543 +0000 UTC m=+0.087163102 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, config_id=ovn_controller, managed_by=edpm_ansible)
Dec 06 10:12:48 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:12:48 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:12:48 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:12:48 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:48.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:12:49 compute-1 ceph-mon[79770]: pgmap v951: 337 pgs: 337 active+clean; 41 MiB data, 260 MiB used, 60 GiB / 60 GiB avail; 2.0 MiB/s rd, 1.8 MiB/s wr, 130 op/s
Dec 06 10:12:49 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:49 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c002690 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:49 compute-1 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #55. Immutable memtables: 0.
Dec 06 10:12:49 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:12:49.821974) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 06 10:12:49 compute-1 ceph-mon[79770]: rocksdb: [db/flush_job.cc:856] [default] [JOB 31] Flushing memtable with next log file: 55
Dec 06 10:12:49 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015969822229, "job": 31, "event": "flush_started", "num_memtables": 1, "num_entries": 1421, "num_deletes": 255, "total_data_size": 3384085, "memory_usage": 3439744, "flush_reason": "Manual Compaction"}
Dec 06 10:12:49 compute-1 ceph-mon[79770]: rocksdb: [db/flush_job.cc:885] [default] [JOB 31] Level-0 flush table #56: started
Dec 06 10:12:49 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015969836937, "cf_name": "default", "job": 31, "event": "table_file_creation", "file_number": 56, "file_size": 2208581, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 28347, "largest_seqno": 29763, "table_properties": {"data_size": 2202622, "index_size": 3222, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1669, "raw_key_size": 12816, "raw_average_key_size": 19, "raw_value_size": 2190437, "raw_average_value_size": 3308, "num_data_blocks": 142, "num_entries": 662, "num_filter_entries": 662, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015855, "oldest_key_time": 1765015855, "file_creation_time": 1765015969, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 56, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:12:49 compute-1 ceph-mon[79770]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 31] Flush lasted 15023 microseconds, and 6050 cpu microseconds.
Dec 06 10:12:49 compute-1 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:12:49 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:12:49.837015) [db/flush_job.cc:967] [default] [JOB 31] Level-0 flush table #56: 2208581 bytes OK
Dec 06 10:12:49 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:12:49.837044) [db/memtable_list.cc:519] [default] Level-0 commit table #56 started
Dec 06 10:12:49 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:12:49.838843) [db/memtable_list.cc:722] [default] Level-0 commit table #56: memtable #1 done
Dec 06 10:12:49 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:12:49.838858) EVENT_LOG_v1 {"time_micros": 1765015969838854, "job": 31, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 06 10:12:49 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:12:49.838877) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 06 10:12:49 compute-1 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 31] Try to delete WAL files size 3377363, prev total WAL file size 3377363, number of live WAL files 2.
Dec 06 10:12:49 compute-1 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000052.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:12:49 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:12:49.839956) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D00353033' seq:72057594037927935, type:22 .. '6C6F676D00373534' seq:0, type:0; will stop at (end)
Dec 06 10:12:49 compute-1 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 32] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 06 10:12:49 compute-1 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 31 Base level 0, inputs: [56(2156KB)], [54(14MB)]
Dec 06 10:12:49 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015969840098, "job": 32, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [56], "files_L6": [54], "score": -1, "input_data_size": 17656957, "oldest_snapshot_seqno": -1}
Dec 06 10:12:49 compute-1 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 32] Generated table #57: 6030 keys, 17525192 bytes, temperature: kUnknown
Dec 06 10:12:49 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015969932427, "cf_name": "default", "job": 32, "event": "table_file_creation", "file_number": 57, "file_size": 17525192, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17481301, "index_size": 27717, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 15109, "raw_key_size": 153754, "raw_average_key_size": 25, "raw_value_size": 17368698, "raw_average_value_size": 2880, "num_data_blocks": 1135, "num_entries": 6030, "num_filter_entries": 6030, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014012, "oldest_key_time": 0, "file_creation_time": 1765015969, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 57, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:12:49 compute-1 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:12:49 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:12:49.932719) [db/compaction/compaction_job.cc:1663] [default] [JOB 32] Compacted 1@0 + 1@6 files to L6 => 17525192 bytes
Dec 06 10:12:49 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:12:49.934061) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 191.0 rd, 189.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.1, 14.7 +0.0 blob) out(16.7 +0.0 blob), read-write-amplify(15.9) write-amplify(7.9) OK, records in: 6556, records dropped: 526 output_compression: NoCompression
Dec 06 10:12:49 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:12:49.934133) EVENT_LOG_v1 {"time_micros": 1765015969934124, "job": 32, "event": "compaction_finished", "compaction_time_micros": 92449, "compaction_time_cpu_micros": 44443, "output_level": 6, "num_output_files": 1, "total_output_size": 17525192, "num_input_records": 6556, "num_output_records": 6030, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 06 10:12:49 compute-1 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000056.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:12:49 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015969934749, "job": 32, "event": "table_file_deletion", "file_number": 56}
Dec 06 10:12:49 compute-1 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000054.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:12:49 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765015969937718, "job": 32, "event": "table_file_deletion", "file_number": 54}
Dec 06 10:12:49 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:12:49.839807) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:12:49 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:12:49.937772) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:12:49 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:12:49.937776) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:12:49 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:12:49.937777) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:12:49 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:12:49.937779) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:12:49 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:12:49.937780) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:12:50 compute-1 nova_compute[228576]: 2025-12-06 10:12:50.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:12:50 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:50 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_52] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240006180 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:50 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:50 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_49] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800cee0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:50 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:12:50 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:12:50 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:50.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:12:50 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/4068013306' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:12:50 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:12:50 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:12:50 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:50.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:12:51 compute-1 nova_compute[228576]: 2025-12-06 10:12:51.464 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:12:51 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:51 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c005300 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:51 compute-1 ceph-mon[79770]: pgmap v952: 337 pgs: 337 active+clean; 41 MiB data, 260 MiB used, 60 GiB / 60 GiB avail; 2.0 MiB/s rd, 14 KiB/s wr, 102 op/s
Dec 06 10:12:51 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/219018723' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:12:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:52 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f223c002690 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:52 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_52] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240006180 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:52 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:12:52 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:12:52 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:52.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:12:52 compute-1 podman[235557]: 2025-12-06 10:12:52.767920642 +0000 UTC m=+0.063876067 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Dec 06 10:12:52 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:12:52.900 141446 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=61eba479-a995-4b31-88b9-8ebfcea9907e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 10:12:52 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:12:52 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:12:52 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:52.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:12:53 compute-1 sudo[235576]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:12:53 compute-1 sudo[235576]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:12:53 compute-1 sudo[235576]: pam_unix(sudo:session): session closed for user root
Dec 06 10:12:53 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:53 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_49] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800cee0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:53 compute-1 ceph-mon[79770]: pgmap v953: 337 pgs: 337 active+clean; 41 MiB data, 260 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 14 KiB/s wr, 100 op/s
Dec 06 10:12:53 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:12:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:12:54.287 141446 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:12:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:12:54.288 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:12:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:12:54.288 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:12:54 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c005300 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:54 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:54 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210001230 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:54 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:12:54 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:12:54 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:54.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:12:54 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:12:54 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:12:54 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:12:54 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:54.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:12:55 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:55 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_54] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240006180 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:55 compute-1 ceph-mon[79770]: pgmap v954: 337 pgs: 337 active+clean; 41 MiB data, 260 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 14 KiB/s wr, 100 op/s
Dec 06 10:12:56 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:56 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_49] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800cee0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:56 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:56 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c005300 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:56 compute-1 podman[235605]: 2025-12-06 10:12:56.746122355 +0000 UTC m=+0.058257169 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 06 10:12:56 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:12:56 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:12:56 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:56.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:12:56 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:12:56 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:12:56 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:56.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:12:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:57 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210001230 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:57 compute-1 ceph-mon[79770]: pgmap v955: 337 pgs: 337 active+clean; 41 MiB data, 260 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 26 op/s
Dec 06 10:12:58 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:58 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_54] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240006180 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:58 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:58 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_49] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800cee0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:58 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:12:58 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:12:58 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:12:58.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:12:58 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:12:58 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:12:58 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:12:58 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:12:58.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:12:59 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:12:59 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c005300 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:12:59 compute-1 ceph-mon[79770]: pgmap v956: 337 pgs: 337 active+clean; 41 MiB data, 260 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 26 op/s
Dec 06 10:13:00 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:13:00 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240006180 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:13:00 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:13:00 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_49] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210001230 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:13:00 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:13:00 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:13:00 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:00.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:13:00 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:13:00 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:13:00 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:00.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:13:01 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:13:01 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210001230 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:13:02 compute-1 ceph-mon[79770]: pgmap v957: 337 pgs: 337 active+clean; 41 MiB data, 260 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:13:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:13:02 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c005300 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:13:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:13:02 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240006180 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:13:02 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:13:02 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.003000074s ======
Dec 06 10:13:02 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:02.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000074s
Dec 06 10:13:02 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:13:02 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:13:02 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:02.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:13:03 compute-1 ceph-mon[79770]: pgmap v958: 337 pgs: 337 active+clean; 41 MiB data, 260 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:13:03 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:13:03 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_54] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240006180 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:13:03 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:13:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:13:04 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_54] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240006180 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:13:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:13:04 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_54] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c005300 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:13:04 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:13:04 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:13:04 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:04.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:13:04 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:13:04 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:13:04 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:04.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:13:05 compute-1 ceph-mon[79770]: pgmap v959: 337 pgs: 337 active+clean; 41 MiB data, 260 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Dec 06 10:13:05 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:13:05 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240006180 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:13:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:13:06 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240006180 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:13:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:13:06 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210001230 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:13:06 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:13:06 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:13:06 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:06.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:13:06 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:13:06 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:13:06 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:06.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:13:07 compute-1 ceph-mon[79770]: pgmap v960: 337 pgs: 337 active+clean; 41 MiB data, 260 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:13:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:13:07 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_54] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c005300 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:13:08 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:13:08 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240006180 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:13:08 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:13:08 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_49] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800cee0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:13:08 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:13:08 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:13:08 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:08.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:13:08 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:13:08 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:13:08 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:13:08 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:08.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:13:09 compute-1 ceph-mon[79770]: pgmap v961: 337 pgs: 337 active+clean; 41 MiB data, 260 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Dec 06 10:13:09 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:13:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:13:09 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2210001230 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:13:10 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/3673570831' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:13:10 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:13:10 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_54] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c005300 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:13:10 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:13:10 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400061a0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:13:10 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:13:10 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:13:10 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:10.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:13:10 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:13:10 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:13:10 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:10.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:13:11 compute-1 ceph-mon[79770]: pgmap v962: 337 pgs: 337 active+clean; 41 MiB data, 260 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:13:11 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:13:11 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_49] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800cee0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:13:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:13:12 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22100042c0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:13:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:13:12 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_54] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c005300 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:13:12 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:13:12 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:13:12 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:12.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:13:12 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:13:12 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:13:12 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:12.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:13:13 compute-1 ceph-mon[79770]: pgmap v963: 337 pgs: 337 active+clean; 41 MiB data, 260 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:13:13 compute-1 sudo[235633]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:13:13 compute-1 sudo[235633]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:13:13 compute-1 sudo[235633]: pam_unix(sudo:session): session closed for user root
Dec 06 10:13:13 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:13:13 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400061c0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:13:13 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:13:14 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:13:14 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_49] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800cee0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:13:14 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:13:14 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22100042c0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:13:14 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:13:14 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:13:14 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:14.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:13:14 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:13:14 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:13:14 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:14.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:13:15 compute-1 ceph-mon[79770]: pgmap v964: 337 pgs: 337 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Dec 06 10:13:15 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/1883968166' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec 06 10:13:15 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/840050543' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec 06 10:13:15 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:13:15 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_54] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c005300 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:13:16 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:13:16 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22400061e0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:13:16 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:13:16 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_49] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800cee0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:13:16 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:13:16 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:13:16 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:16.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:13:16 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:13:16 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:13:16 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:16.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:13:17 compute-1 ceph-mon[79770]: pgmap v965: 337 pgs: 337 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 17 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Dec 06 10:13:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:13:17 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22100042c0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:13:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:13:18 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_54] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c005300 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:13:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:13:18 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240006200 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:13:18 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:13:18 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:13:18 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:18.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:13:18 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:13:18 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:13:18 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:13:18 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:18.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:13:19 compute-1 ceph-mon[79770]: pgmap v966: 337 pgs: 337 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.3 MiB/s rd, 1.8 MiB/s wr, 81 op/s
Dec 06 10:13:19 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:13:19 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_49] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221800cee0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:13:19 compute-1 podman[235661]: 2025-12-06 10:13:19.803658111 +0000 UTC m=+0.098727923 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 06 10:13:20 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:13:20 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_50] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f22100042c0 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:13:20 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:13:20 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_54] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f221c005300 fd 28 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:13:20 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:13:20 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:13:20 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:20.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:13:20 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:13:20 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:13:20 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:20.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:13:21 compute-1 ceph-mon[79770]: pgmap v967: 337 pgs: 337 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.3 MiB/s rd, 1.8 MiB/s wr, 81 op/s
Dec 06 10:13:21 compute-1 kernel: ganesha.nfsd[234951]: segfault at 50 ip 00007f22f486932e sp 00007f22b1ffa210 error 4 in libntirpc.so.5.8[7f22f484e000+2c000] likely on CPU 1 (core 0, socket 1)
Dec 06 10:13:21 compute-1 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec 06 10:13:21 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[229481]: 06/12/2025 10:13:21 : epoch 6933ff06 : compute-1 : ganesha.nfsd-2[svc_45] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f2240006220 fd 28 proxy ignored for local
Dec 06 10:13:21 compute-1 systemd[1]: Started Process Core Dump (PID 235689/UID 0).
Dec 06 10:13:22 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:13:22 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:13:22 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:22.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:13:22 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:13:22 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:13:22 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:22.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:13:23 compute-1 ceph-mon[79770]: pgmap v968: 337 pgs: 337 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.3 MiB/s rd, 1.8 MiB/s wr, 81 op/s
Dec 06 10:13:23 compute-1 podman[235692]: 2025-12-06 10:13:23.775532537 +0000 UTC m=+0.073874157 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Dec 06 10:13:23 compute-1 systemd-coredump[235690]: Process 229485 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 94:
                                                    #0  0x00007f22f486932e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    ELF object binary architecture: AMD x86-64
Dec 06 10:13:23 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:13:23 compute-1 systemd[1]: systemd-coredump@8-235689-0.service: Deactivated successfully.
Dec 06 10:13:23 compute-1 systemd[1]: systemd-coredump@8-235689-0.service: Consumed 2.060s CPU time.
Dec 06 10:13:24 compute-1 podman[235713]: 2025-12-06 10:13:24.024022732 +0000 UTC m=+0.030967726 container died cfd84277d1dcac04a876f3b0ccbf223dd9196bdf0059805be5855adee48962d9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, ceph=True, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 06 10:13:24 compute-1 systemd[1]: var-lib-containers-storage-overlay-ce8c7da4624aa519272ef2c8bd30d12c947da67ff2923b4958fe16726ed31e84-merged.mount: Deactivated successfully.
Dec 06 10:13:24 compute-1 podman[235713]: 2025-12-06 10:13:24.065342094 +0000 UTC m=+0.072287048 container remove cfd84277d1dcac04a876f3b0ccbf223dd9196bdf0059805be5855adee48962d9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 06 10:13:24 compute-1 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Main process exited, code=exited, status=139/n/a
Dec 06 10:13:24 compute-1 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Failed with result 'exit-code'.
Dec 06 10:13:24 compute-1 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Consumed 3.064s CPU time.
Dec 06 10:13:24 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:13:24 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:13:24 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:13:24 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:24.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:13:24 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:13:24 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:13:24 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:24.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:13:25 compute-1 ceph-mon[79770]: pgmap v969: 337 pgs: 337 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 101 op/s
Dec 06 10:13:26 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:13:26 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:13:26 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:26.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:13:26 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:13:26 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:13:26 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:26.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:13:27 compute-1 ceph-mon[79770]: pgmap v970: 337 pgs: 337 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 73 op/s
Dec 06 10:13:27 compute-1 podman[235759]: 2025-12-06 10:13:27.767475591 +0000 UTC m=+0.064947487 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 06 10:13:28 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/101328 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec 06 10:13:28 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:13:28 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:13:28 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:28.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:13:28 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:13:28 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:13:28 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:13:28 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:28.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:13:29 compute-1 ceph-mon[79770]: pgmap v971: 337 pgs: 337 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 74 op/s
Dec 06 10:13:30 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:13:30 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:13:30 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:30.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:13:30 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:13:30 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:13:30 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:30.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:13:31 compute-1 ceph-mon[79770]: pgmap v972: 337 pgs: 337 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 634 KiB/s rd, 20 op/s
Dec 06 10:13:32 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:13:32 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:13:32 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:32.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:13:32 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:13:32 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:13:32 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:32.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:13:33 compute-1 ceph-mon[79770]: pgmap v973: 337 pgs: 337 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 634 KiB/s rd, 20 op/s
Dec 06 10:13:33 compute-1 sudo[235782]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:13:33 compute-1 sudo[235782]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:13:33 compute-1 sudo[235782]: pam_unix(sudo:session): session closed for user root
Dec 06 10:13:33 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/101333 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec 06 10:13:33 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:13:34 compute-1 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Scheduled restart job, restart counter is at 9.
Dec 06 10:13:34 compute-1 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.djsnbu for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec 06 10:13:34 compute-1 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Consumed 3.064s CPU time.
Dec 06 10:13:34 compute-1 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.djsnbu for 5ecd3f74-dade-5fc4-92ce-8950ae424258...
Dec 06 10:13:34 compute-1 podman[235856]: 2025-12-06 10:13:34.664906951 +0000 UTC m=+0.048159391 container create 2d93c4a34df0fb0a855605f5ca927eca7d3f452dbc047710bdbb64fd976c80b1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 06 10:13:34 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f75aa19a07c84791431fa0a6498e04afd32b24ddc88ad13037871505e8e2ffaf/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec 06 10:13:34 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f75aa19a07c84791431fa0a6498e04afd32b24ddc88ad13037871505e8e2ffaf/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 06 10:13:34 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f75aa19a07c84791431fa0a6498e04afd32b24ddc88ad13037871505e8e2ffaf/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec 06 10:13:34 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f75aa19a07c84791431fa0a6498e04afd32b24ddc88ad13037871505e8e2ffaf/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.djsnbu-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec 06 10:13:34 compute-1 podman[235856]: 2025-12-06 10:13:34.736701877 +0000 UTC m=+0.119954327 container init 2d93c4a34df0fb0a855605f5ca927eca7d3f452dbc047710bdbb64fd976c80b1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:13:34 compute-1 podman[235856]: 2025-12-06 10:13:34.644761353 +0000 UTC m=+0.028013843 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 06 10:13:34 compute-1 podman[235856]: 2025-12-06 10:13:34.74168192 +0000 UTC m=+0.124934360 container start 2d93c4a34df0fb0a855605f5ca927eca7d3f452dbc047710bdbb64fd976c80b1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2)
Dec 06 10:13:34 compute-1 bash[235856]: 2d93c4a34df0fb0a855605f5ca927eca7d3f452dbc047710bdbb64fd976c80b1
Dec 06 10:13:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:34 : epoch 693401ce : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec 06 10:13:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:34 : epoch 693401ce : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec 06 10:13:34 compute-1 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.djsnbu for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec 06 10:13:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:34 : epoch 693401ce : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec 06 10:13:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:34 : epoch 693401ce : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec 06 10:13:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:34 : epoch 693401ce : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec 06 10:13:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:34 : epoch 693401ce : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec 06 10:13:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:34 : epoch 693401ce : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec 06 10:13:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:34 : epoch 693401ce : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:13:34 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:13:34 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:13:34 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:34.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:13:34 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:13:34 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:13:34 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:34.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:13:35 compute-1 ceph-mon[79770]: pgmap v974: 337 pgs: 337 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 935 KiB/s rd, 2.1 MiB/s wr, 82 op/s
Dec 06 10:13:36 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:13:36 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:13:36 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:36.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:13:36 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:13:36 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:13:36 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:36.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:13:37 compute-1 sshd-session[235914]: banner exchange: Connection from 3.131.215.38 port 55714: invalid format
Dec 06 10:13:37 compute-1 ceph-mon[79770]: pgmap v975: 337 pgs: 337 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 305 KiB/s rd, 2.1 MiB/s wr, 62 op/s
Dec 06 10:13:38 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:13:38 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:13:38 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:38.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:13:38 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:13:38 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:13:38 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:13:38 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:38.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:13:39 compute-1 ceph-mon[79770]: pgmap v976: 337 pgs: 337 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 306 KiB/s rd, 2.1 MiB/s wr, 63 op/s
Dec 06 10:13:39 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:13:40 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:13:40 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:13:40 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:40.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:13:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:40 : epoch 693401ce : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:13:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:40 : epoch 693401ce : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:13:40 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:13:40 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:13:40 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:40.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:13:41 compute-1 ceph-mon[79770]: pgmap v977: 337 pgs: 337 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 303 KiB/s rd, 2.1 MiB/s wr, 62 op/s
Dec 06 10:13:42 compute-1 nova_compute[228576]: 2025-12-06 10:13:42.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:13:42 compute-1 nova_compute[228576]: 2025-12-06 10:13:42.471 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:13:42 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:13:42 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:13:42 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:42.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:13:42 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:13:42 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:13:42 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:42.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:13:43 compute-1 ceph-mon[79770]: pgmap v978: 337 pgs: 337 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 303 KiB/s rd, 2.1 MiB/s wr, 62 op/s
Dec 06 10:13:43 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:13:44 compute-1 nova_compute[228576]: 2025-12-06 10:13:44.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:13:44 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/2410722622' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:13:44 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/1801167825' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:13:44 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:13:44 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:13:44 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:44.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:13:44 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:13:44 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:13:44 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:44.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:13:45 compute-1 ceph-mon[79770]: pgmap v979: 337 pgs: 337 active+clean; 41 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 319 KiB/s rd, 2.1 MiB/s wr, 86 op/s
Dec 06 10:13:45 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/2469516180' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:13:45 compute-1 sudo[235919]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:13:45 compute-1 sudo[235919]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:13:45 compute-1 sudo[235919]: pam_unix(sudo:session): session closed for user root
Dec 06 10:13:46 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:13:46.013 141446 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '9a:dc:0d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'b6:0a:c4:b8:be:39'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:13:46 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:13:46.015 141446 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 06 10:13:46 compute-1 sudo[235944]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 06 10:13:46 compute-1 sudo[235944]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:13:46 compute-1 sudo[235944]: pam_unix(sudo:session): session closed for user root
Dec 06 10:13:46 compute-1 sudo[236001]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:13:46 compute-1 sudo[236001]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:13:46 compute-1 sudo[236001]: pam_unix(sudo:session): session closed for user root
Dec 06 10:13:46 compute-1 sudo[236026]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 list-networks
Dec 06 10:13:46 compute-1 sudo[236026]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:13:46 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:13:46 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:13:46 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:46.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:13:46 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:46 : epoch 693401ce : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec 06 10:13:46 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:46 : epoch 693401ce : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec 06 10:13:46 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:46 : epoch 693401ce : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec 06 10:13:46 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:46 : epoch 693401ce : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec 06 10:13:46 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:46 : epoch 693401ce : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec 06 10:13:46 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:46 : epoch 693401ce : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec 06 10:13:46 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:46 : epoch 693401ce : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec 06 10:13:46 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:46 : epoch 693401ce : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 06 10:13:46 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:46 : epoch 693401ce : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 06 10:13:46 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:46 : epoch 693401ce : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 06 10:13:46 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:46 : epoch 693401ce : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec 06 10:13:46 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:46 : epoch 693401ce : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 06 10:13:46 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:46 : epoch 693401ce : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec 06 10:13:46 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:46 : epoch 693401ce : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec 06 10:13:46 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:46 : epoch 693401ce : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec 06 10:13:46 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:46 : epoch 693401ce : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec 06 10:13:46 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:46 : epoch 693401ce : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec 06 10:13:46 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:46 : epoch 693401ce : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec 06 10:13:46 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:46 : epoch 693401ce : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec 06 10:13:46 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:46 : epoch 693401ce : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec 06 10:13:46 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:46 : epoch 693401ce : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec 06 10:13:46 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:46 : epoch 693401ce : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec 06 10:13:46 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:46 : epoch 693401ce : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec 06 10:13:46 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:46 : epoch 693401ce : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec 06 10:13:46 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:46 : epoch 693401ce : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec 06 10:13:46 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:46 : epoch 693401ce : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec 06 10:13:46 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:46 : epoch 693401ce : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec 06 10:13:46 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:46 : epoch 693401ce : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:13:46 compute-1 ceph-mon[79770]: from='client.? 192.168.122.10:0/3717895380' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 06 10:13:46 compute-1 ceph-mon[79770]: from='client.? 192.168.122.10:0/3717895380' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 06 10:13:46 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:13:46 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:13:46 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:46.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:13:47 compute-1 sudo[236026]: pam_unix(sudo:session): session closed for user root
Dec 06 10:13:47 compute-1 nova_compute[228576]: 2025-12-06 10:13:47.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:13:47 compute-1 nova_compute[228576]: 2025-12-06 10:13:47.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:13:47 compute-1 nova_compute[228576]: 2025-12-06 10:13:47.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:13:47 compute-1 nova_compute[228576]: 2025-12-06 10:13:47.490 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:13:47 compute-1 nova_compute[228576]: 2025-12-06 10:13:47.491 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:13:47 compute-1 nova_compute[228576]: 2025-12-06 10:13:47.491 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:13:47 compute-1 nova_compute[228576]: 2025-12-06 10:13:47.491 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:13:47 compute-1 nova_compute[228576]: 2025-12-06 10:13:47.491 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:13:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:47 : epoch 693401ce : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1bd4000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:13:47 compute-1 ceph-mon[79770]: pgmap v980: 337 pgs: 337 active+clean; 41 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 17 KiB/s rd, 19 KiB/s wr, 25 op/s
Dec 06 10:13:47 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:13:47 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:13:47 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:13:47 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:13:47 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:13:47 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:13:47 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 10:13:47 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 06 10:13:47 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:13:47 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:13:47 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 06 10:13:47 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 06 10:13:47 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 10:13:47 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:13:47 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4031114062' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:13:47 compute-1 nova_compute[228576]: 2025-12-06 10:13:47.967 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.476s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:13:48 compute-1 nova_compute[228576]: 2025-12-06 10:13:48.157 228580 WARNING nova.virt.libvirt.driver [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:13:48 compute-1 nova_compute[228576]: 2025-12-06 10:13:48.158 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5219MB free_disk=59.98813247680664GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:13:48 compute-1 nova_compute[228576]: 2025-12-06 10:13:48.159 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:13:48 compute-1 nova_compute[228576]: 2025-12-06 10:13:48.159 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:13:48 compute-1 nova_compute[228576]: 2025-12-06 10:13:48.224 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:13:48 compute-1 nova_compute[228576]: 2025-12-06 10:13:48.225 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:13:48 compute-1 nova_compute[228576]: 2025-12-06 10:13:48.238 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:13:48 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:48 : epoch 693401ce : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1bc8001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:13:48 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:48 : epoch 693401ce : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1bb0000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:13:48 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:13:48 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2281509267' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:13:48 compute-1 nova_compute[228576]: 2025-12-06 10:13:48.700 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.462s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:13:48 compute-1 nova_compute[228576]: 2025-12-06 10:13:48.708 228580 DEBUG nova.compute.provider_tree [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed in ProviderTree for provider: ff2f17cb-ff1d-4da7-9560-4be741380cb1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:13:48 compute-1 nova_compute[228576]: 2025-12-06 10:13:48.728 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed for provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:13:48 compute-1 nova_compute[228576]: 2025-12-06 10:13:48.731 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:13:48 compute-1 nova_compute[228576]: 2025-12-06 10:13:48.731 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.572s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:13:48 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:13:48 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:13:48 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:48.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:13:48 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:13:48 compute-1 ceph-mon[79770]: pgmap v981: 337 pgs: 337 active+clean; 41 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 21 KiB/s wr, 27 op/s
Dec 06 10:13:48 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/4031114062' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:13:48 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/2281509267' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:13:48 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:13:48 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:13:48 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:48.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:13:49 compute-1 nova_compute[228576]: 2025-12-06 10:13:49.732 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:13:49 compute-1 nova_compute[228576]: 2025-12-06 10:13:49.733 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:13:49 compute-1 nova_compute[228576]: 2025-12-06 10:13:49.733 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:13:49 compute-1 nova_compute[228576]: 2025-12-06 10:13:49.752 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 06 10:13:49 compute-1 nova_compute[228576]: 2025-12-06 10:13:49.752 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:13:49 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:49 : epoch 693401ce : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1bac000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:13:49 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:49 : epoch 693401ce : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:13:49 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:49 : epoch 693401ce : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:13:50 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/101350 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec 06 10:13:50 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:50 : epoch 693401ce : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1bb8000fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:13:50 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:50 : epoch 693401ce : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1bc8001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:13:50 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:13:50 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:13:50 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:50.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:13:50 compute-1 podman[236131]: 2025-12-06 10:13:50.858793276 +0000 UTC m=+0.155364343 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 06 10:13:50 compute-1 ceph-mon[79770]: pgmap v982: 337 pgs: 337 active+clean; 41 MiB data, 274 MiB used, 60 GiB / 60 GiB avail; 23 KiB/s rd, 9.7 KiB/s wr, 35 op/s
Dec 06 10:13:50 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:13:50 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:13:50 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:50.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:13:51 compute-1 nova_compute[228576]: 2025-12-06 10:13:51.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:13:51 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:51 : epoch 693401ce : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1bb00016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:13:51 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/2027604282' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:13:51 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/2774251911' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:13:52 compute-1 sshd-session[236000]: Connection closed by 3.131.215.38 port 54876
Dec 06 10:13:52 compute-1 sudo[236158]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:13:52 compute-1 sudo[236158]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:13:52 compute-1 sudo[236158]: pam_unix(sudo:session): session closed for user root
Dec 06 10:13:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:52 : epoch 693401ce : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1bac0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:13:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:52 : epoch 693401ce : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1bb8001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:13:52 compute-1 sshd-session[236106]: Connection closed by 3.131.215.38 port 54888
Dec 06 10:13:52 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:13:52 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:13:52 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:52.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:13:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:52 : epoch 693401ce : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec 06 10:13:52 compute-1 ceph-mon[79770]: pgmap v983: 337 pgs: 337 active+clean; 41 MiB data, 274 MiB used, 60 GiB / 60 GiB avail; 23 KiB/s rd, 9.7 KiB/s wr, 35 op/s
Dec 06 10:13:52 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:13:52 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:13:52 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:13:52 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:13:52 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:52.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:13:53 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:13:53.017 141446 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=61eba479-a995-4b31-88b9-8ebfcea9907e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 10:13:53 compute-1 nova_compute[228576]: 2025-12-06 10:13:53.464 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:13:53 compute-1 sudo[236183]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:13:53 compute-1 sudo[236183]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:13:53 compute-1 sudo[236183]: pam_unix(sudo:session): session closed for user root
Dec 06 10:13:53 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:53 : epoch 693401ce : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1bc8001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:13:53 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:13:54 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:13:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:13:54.288 141446 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:13:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:13:54.289 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:13:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:13:54.289 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:13:54 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:54 : epoch 693401ce : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1bb00016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:13:54 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:54 : epoch 693401ce : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1bac0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:13:54 compute-1 podman[236209]: 2025-12-06 10:13:54.750428669 +0000 UTC m=+0.058724393 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, tcib_managed=true)
Dec 06 10:13:54 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:13:54 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:13:54 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:54.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:13:54 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:13:54 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:13:54 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:54.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:13:55 compute-1 ceph-mon[79770]: pgmap v984: 337 pgs: 337 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 25 KiB/s rd, 10 KiB/s wr, 36 op/s
Dec 06 10:13:55 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:55 : epoch 693401ce : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1bb8001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:13:55 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/101355 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec 06 10:13:56 compute-1 ceph-mon[79770]: pgmap v985: 337 pgs: 337 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 7.2 KiB/s rd, 1.7 KiB/s wr, 10 op/s
Dec 06 10:13:56 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:56 : epoch 693401ce : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1bc8001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:13:56 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:56 : epoch 693401ce : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1bb00016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:13:56 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:13:56 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:13:56 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:56.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:13:56 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:13:57 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:13:57 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:56.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:13:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:57 : epoch 693401ce : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1bac0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:13:58 compute-1 ceph-mon[79770]: pgmap v986: 337 pgs: 337 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 7.2 KiB/s rd, 1.7 KiB/s wr, 10 op/s
Dec 06 10:13:58 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:58 : epoch 693401ce : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1bb8001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:13:58 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:58 : epoch 693401ce : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1bb8001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:13:58 compute-1 podman[236229]: 2025-12-06 10:13:58.828287647 +0000 UTC m=+0.084615833 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125)
Dec 06 10:13:58 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:13:58 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:13:58 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:13:58.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:13:58 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:13:59 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:13:59 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:13:59 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:13:59.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:13:59 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:13:59 : epoch 693401ce : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1bb0002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:14:00 compute-1 ceph-mon[79770]: pgmap v987: 337 pgs: 337 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 6.7 KiB/s rd, 1.6 KiB/s wr, 10 op/s
Dec 06 10:14:00 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:14:00 : epoch 693401ce : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1bac002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:14:00 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:14:00 : epoch 693401ce : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1bb8001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:14:00 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:14:00 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:14:00 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:00.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:14:01 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:14:01 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:14:01 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:01.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:14:01 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:14:01 : epoch 693401ce : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1bb8001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:14:02 compute-1 ceph-mon[79770]: pgmap v988: 337 pgs: 337 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 255 B/s wr, 1 op/s
Dec 06 10:14:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:14:02 : epoch 693401ce : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1bb0002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:14:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:14:02 : epoch 693401ce : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1bac002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:14:02 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:14:02 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:14:02 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:02.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:14:03 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:14:03 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:14:03 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:03.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:14:03 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:14:03 : epoch 693401ce : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1bb8001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:14:03 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:14:04 compute-1 ceph-mon[79770]: pgmap v989: 337 pgs: 337 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 255 B/s wr, 1 op/s
Dec 06 10:14:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:14:04 : epoch 693401ce : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1bc80030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:14:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:14:04 : epoch 693401ce : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1bb0002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:14:04 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:14:04 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:14:04 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:04.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:14:05 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:14:05 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:14:05 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:05.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:14:05 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/1452760283' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:14:05 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:14:05 : epoch 693401ce : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1bac002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:14:06 compute-1 ceph-mon[79770]: pgmap v990: 337 pgs: 337 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 85 B/s rd, 0 B/s wr, 0 op/s
Dec 06 10:14:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:14:06 : epoch 693401ce : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1bb8003b20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:14:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:14:06 : epoch 693401ce : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1bc80030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:14:06 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:14:06 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:14:06 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:06.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:14:07 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:14:07 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:14:07 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:07.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:14:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:14:07 : epoch 693401ce : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1bc80030f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:14:08 compute-1 ceph-mon[79770]: pgmap v991: 337 pgs: 337 active+clean; 41 MiB data, 264 MiB used, 60 GiB / 60 GiB avail; 85 B/s rd, 0 B/s wr, 0 op/s
Dec 06 10:14:08 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:14:08 : epoch 693401ce : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1bac003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:14:08 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:14:08 : epoch 693401ce : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1bb8003b20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:14:08 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:14:08 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:14:08 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:08.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:14:08 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:14:09 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:14:09 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:14:09 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:09.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:14:09 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:14:09 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/2607869369' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec 06 10:14:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:14:09 : epoch 693401ce : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1bb0003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:14:10 compute-1 ceph-mon[79770]: pgmap v992: 337 pgs: 337 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 17 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Dec 06 10:14:10 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/541364610' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec 06 10:14:10 compute-1 kernel: ganesha.nfsd[236054]: segfault at 50 ip 00007f1c8425e32e sp 00007f1c527fb210 error 4 in libntirpc.so.5.8[7f1c84243000+2c000] likely on CPU 0 (core 0, socket 0)
Dec 06 10:14:10 compute-1 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec 06 10:14:10 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[235871]: 06/12/2025 10:14:10 : epoch 693401ce : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f1bc80030f0 fd 39 proxy ignored for local
Dec 06 10:14:10 compute-1 systemd[1]: Started Process Core Dump (PID 236254/UID 0).
Dec 06 10:14:10 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:14:10 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:14:10 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:10.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:14:11 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:14:11 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:14:11 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:11.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:14:11 compute-1 systemd-coredump[236255]: Process 235875 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 43:
                                                    #0  0x00007f1c8425e32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    ELF object binary architecture: AMD x86-64
Dec 06 10:14:12 compute-1 systemd[1]: systemd-coredump@9-236254-0.service: Deactivated successfully.
Dec 06 10:14:12 compute-1 systemd[1]: systemd-coredump@9-236254-0.service: Consumed 1.390s CPU time.
Dec 06 10:14:12 compute-1 podman[236260]: 2025-12-06 10:14:12.106221375 +0000 UTC m=+0.027942192 container died 2d93c4a34df0fb0a855605f5ca927eca7d3f452dbc047710bdbb64fd976c80b1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:14:12 compute-1 systemd[1]: var-lib-containers-storage-overlay-f75aa19a07c84791431fa0a6498e04afd32b24ddc88ad13037871505e8e2ffaf-merged.mount: Deactivated successfully.
Dec 06 10:14:12 compute-1 podman[236260]: 2025-12-06 10:14:12.150025588 +0000 UTC m=+0.071746365 container remove 2d93c4a34df0fb0a855605f5ca927eca7d3f452dbc047710bdbb64fd976c80b1 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, ceph=True, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid)
Dec 06 10:14:12 compute-1 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Main process exited, code=exited, status=139/n/a
Dec 06 10:14:12 compute-1 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Failed with result 'exit-code'.
Dec 06 10:14:12 compute-1 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Consumed 1.743s CPU time.
Dec 06 10:14:12 compute-1 ceph-mon[79770]: pgmap v993: 337 pgs: 337 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 17 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Dec 06 10:14:12 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:14:12 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:14:12 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:12.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:14:13 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:14:13 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:14:13 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:13.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:14:13 compute-1 sudo[236304]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:14:13 compute-1 sudo[236304]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:14:13 compute-1 sudo[236304]: pam_unix(sudo:session): session closed for user root
Dec 06 10:14:13 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:14:14 compute-1 ceph-mon[79770]: pgmap v994: 337 pgs: 337 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 1.6 MiB/s rd, 1.8 MiB/s wr, 90 op/s
Dec 06 10:14:14 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:14:14 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:14:14 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:14.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:14:15 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:14:15 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:14:15 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:15.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:14:16 compute-1 ceph-mon[79770]: pgmap v995: 337 pgs: 337 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 1.6 MiB/s rd, 1.8 MiB/s wr, 90 op/s
Dec 06 10:14:16 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/101416 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec 06 10:14:16 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:14:16 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:14:16 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:16.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:14:17 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:14:17 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:14:17 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:17.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:14:18 compute-1 ceph-mon[79770]: pgmap v996: 337 pgs: 337 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 1.6 MiB/s rd, 1.8 MiB/s wr, 90 op/s
Dec 06 10:14:18 compute-1 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #58. Immutable memtables: 0.
Dec 06 10:14:18 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:14:18.558382) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 06 10:14:18 compute-1 ceph-mon[79770]: rocksdb: [db/flush_job.cc:856] [default] [JOB 33] Flushing memtable with next log file: 58
Dec 06 10:14:18 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016058558663, "job": 33, "event": "flush_started", "num_memtables": 1, "num_entries": 1434, "num_deletes": 503, "total_data_size": 2830874, "memory_usage": 2884920, "flush_reason": "Manual Compaction"}
Dec 06 10:14:18 compute-1 ceph-mon[79770]: rocksdb: [db/flush_job.cc:885] [default] [JOB 33] Level-0 flush table #59: started
Dec 06 10:14:18 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016058574492, "cf_name": "default", "job": 33, "event": "table_file_creation", "file_number": 59, "file_size": 1843544, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 29768, "largest_seqno": 31197, "table_properties": {"data_size": 1837641, "index_size": 2723, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2117, "raw_key_size": 16377, "raw_average_key_size": 19, "raw_value_size": 1823788, "raw_average_value_size": 2202, "num_data_blocks": 117, "num_entries": 828, "num_filter_entries": 828, "num_deletions": 503, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765015970, "oldest_key_time": 1765015970, "file_creation_time": 1765016058, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 59, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:14:18 compute-1 ceph-mon[79770]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 33] Flush lasted 16096 microseconds, and 7701 cpu microseconds.
Dec 06 10:14:18 compute-1 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:14:18 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:14:18.574556) [db/flush_job.cc:967] [default] [JOB 33] Level-0 flush table #59: 1843544 bytes OK
Dec 06 10:14:18 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:14:18.574585) [db/memtable_list.cc:519] [default] Level-0 commit table #59 started
Dec 06 10:14:18 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:14:18.576471) [db/memtable_list.cc:722] [default] Level-0 commit table #59: memtable #1 done
Dec 06 10:14:18 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:14:18.576485) EVENT_LOG_v1 {"time_micros": 1765016058576480, "job": 33, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 06 10:14:18 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:14:18.576502) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 06 10:14:18 compute-1 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 33] Try to delete WAL files size 2823129, prev total WAL file size 2823129, number of live WAL files 2.
Dec 06 10:14:18 compute-1 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000055.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:14:18 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:14:18.577285) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032323539' seq:72057594037927935, type:22 .. '7061786F730032353131' seq:0, type:0; will stop at (end)
Dec 06 10:14:18 compute-1 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 34] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 06 10:14:18 compute-1 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 33 Base level 0, inputs: [59(1800KB)], [57(16MB)]
Dec 06 10:14:18 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016058577371, "job": 34, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [59], "files_L6": [57], "score": -1, "input_data_size": 19368736, "oldest_snapshot_seqno": -1}
Dec 06 10:14:18 compute-1 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 34] Generated table #60: 5833 keys, 13151442 bytes, temperature: kUnknown
Dec 06 10:14:18 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016058638946, "cf_name": "default", "job": 34, "event": "table_file_creation", "file_number": 60, "file_size": 13151442, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13113936, "index_size": 21844, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 14597, "raw_key_size": 150788, "raw_average_key_size": 25, "raw_value_size": 13009737, "raw_average_value_size": 2230, "num_data_blocks": 875, "num_entries": 5833, "num_filter_entries": 5833, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014012, "oldest_key_time": 0, "file_creation_time": 1765016058, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 60, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:14:18 compute-1 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:14:18 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:14:18.639279) [db/compaction/compaction_job.cc:1663] [default] [JOB 34] Compacted 1@0 + 1@6 files to L6 => 13151442 bytes
Dec 06 10:14:18 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:14:18.641865) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 314.1 rd, 213.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.8, 16.7 +0.0 blob) out(12.5 +0.0 blob), read-write-amplify(17.6) write-amplify(7.1) OK, records in: 6858, records dropped: 1025 output_compression: NoCompression
Dec 06 10:14:18 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:14:18.641922) EVENT_LOG_v1 {"time_micros": 1765016058641900, "job": 34, "event": "compaction_finished", "compaction_time_micros": 61669, "compaction_time_cpu_micros": 29630, "output_level": 6, "num_output_files": 1, "total_output_size": 13151442, "num_input_records": 6858, "num_output_records": 5833, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 06 10:14:18 compute-1 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000059.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:14:18 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016058642683, "job": 34, "event": "table_file_deletion", "file_number": 59}
Dec 06 10:14:18 compute-1 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000057.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:14:18 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016058647000, "job": 34, "event": "table_file_deletion", "file_number": 57}
Dec 06 10:14:18 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:14:18.577230) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:14:18 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:14:18.647045) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:14:18 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:14:18.647052) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:14:18 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:14:18.647054) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:14:18 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:14:18.647055) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:14:18 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:14:18.647057) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:14:18 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:14:18 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:14:18 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:18.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:14:18 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:14:19 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:14:19 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:14:19 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:19.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:14:20 compute-1 ceph-mon[79770]: pgmap v997: 337 pgs: 337 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 101 op/s
Dec 06 10:14:20 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:14:20 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:14:20 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:20.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:14:21 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:14:21 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:14:21 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:21.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:14:21 compute-1 podman[236333]: 2025-12-06 10:14:21.813129879 +0000 UTC m=+0.115084217 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:14:22 compute-1 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Scheduled restart job, restart counter is at 10.
Dec 06 10:14:22 compute-1 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.djsnbu for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec 06 10:14:22 compute-1 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Consumed 1.743s CPU time.
Dec 06 10:14:22 compute-1 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.djsnbu for 5ecd3f74-dade-5fc4-92ce-8950ae424258...
Dec 06 10:14:22 compute-1 ceph-mon[79770]: pgmap v998: 337 pgs: 337 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 73 op/s
Dec 06 10:14:22 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/1807636740' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:14:22 compute-1 podman[236410]: 2025-12-06 10:14:22.644998538 +0000 UTC m=+0.052322835 container create e1672b0d6c65fac4dad8abe557390306766af98aa8142ef347d33cd29910d02b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 06 10:14:22 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8d8a693bbb4eb50dbf44c219f4881afeb798ed99392b659e6bc95aa1a478f7c4/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec 06 10:14:22 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8d8a693bbb4eb50dbf44c219f4881afeb798ed99392b659e6bc95aa1a478f7c4/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 06 10:14:22 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8d8a693bbb4eb50dbf44c219f4881afeb798ed99392b659e6bc95aa1a478f7c4/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec 06 10:14:22 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8d8a693bbb4eb50dbf44c219f4881afeb798ed99392b659e6bc95aa1a478f7c4/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.djsnbu-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec 06 10:14:22 compute-1 podman[236410]: 2025-12-06 10:14:22.702691755 +0000 UTC m=+0.110016072 container init e1672b0d6c65fac4dad8abe557390306766af98aa8142ef347d33cd29910d02b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, ceph=True)
Dec 06 10:14:22 compute-1 podman[236410]: 2025-12-06 10:14:22.710363954 +0000 UTC m=+0.117688251 container start e1672b0d6c65fac4dad8abe557390306766af98aa8142ef347d33cd29910d02b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 06 10:14:22 compute-1 bash[236410]: e1672b0d6c65fac4dad8abe557390306766af98aa8142ef347d33cd29910d02b
Dec 06 10:14:22 compute-1 podman[236410]: 2025-12-06 10:14:22.625744772 +0000 UTC m=+0.033069119 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 06 10:14:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:22 : epoch 693401fe : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec 06 10:14:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:22 : epoch 693401fe : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec 06 10:14:22 compute-1 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.djsnbu for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec 06 10:14:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:22 : epoch 693401fe : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec 06 10:14:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:22 : epoch 693401fe : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec 06 10:14:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:22 : epoch 693401fe : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec 06 10:14:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:22 : epoch 693401fe : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec 06 10:14:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:22 : epoch 693401fe : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec 06 10:14:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:22 : epoch 693401fe : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:14:22 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:14:22 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:14:22 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:22.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:14:23 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:14:23 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:14:23 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:23.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:14:23 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:14:24 compute-1 ceph-mon[79770]: pgmap v999: 337 pgs: 337 active+clean; 134 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 101 op/s
Dec 06 10:14:24 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:14:24 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:14:24 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:14:24 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:24.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:14:25 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:14:25 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:14:25 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:25.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:14:25 compute-1 podman[236470]: 2025-12-06 10:14:25.763135263 +0000 UTC m=+0.069278973 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent)
Dec 06 10:14:26 compute-1 ceph-mon[79770]: pgmap v1000: 337 pgs: 337 active+clean; 134 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 358 KiB/s rd, 1.8 MiB/s wr, 38 op/s
Dec 06 10:14:26 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/955708754' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec 06 10:14:26 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/2572115795' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec 06 10:14:26 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:14:26 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:14:26 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:26.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:14:27 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:14:27 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:14:27 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:27.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:14:28 compute-1 ceph-mon[79770]: pgmap v1001: 337 pgs: 337 active+clean; 134 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 358 KiB/s rd, 1.8 MiB/s wr, 38 op/s
Dec 06 10:14:28 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:28 : epoch 693401fe : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:14:28 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:28 : epoch 693401fe : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:14:28 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:14:28 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:14:28 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:28.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:14:28 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:14:29 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:14:29 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:14:29 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:29.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:14:29 compute-1 podman[236491]: 2025-12-06 10:14:29.814924406 +0000 UTC m=+0.098417765 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3)
Dec 06 10:14:30 compute-1 ceph-mon[79770]: pgmap v1002: 337 pgs: 337 active+clean; 167 MiB data, 335 MiB used, 60 GiB / 60 GiB avail; 683 KiB/s rd, 3.9 MiB/s wr, 108 op/s
Dec 06 10:14:30 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:14:30 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:14:30 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:30.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:14:31 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:14:31 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:14:31 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:31.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:14:32 compute-1 ceph-mon[79770]: pgmap v1003: 337 pgs: 337 active+clean; 167 MiB data, 335 MiB used, 60 GiB / 60 GiB avail; 346 KiB/s rd, 3.9 MiB/s wr, 97 op/s
Dec 06 10:14:32 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:14:32 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:14:32 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:32.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:14:33 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:14:33 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:14:33 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:33.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:14:33 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:14:33 compute-1 sudo[236513]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:14:33 compute-1 sudo[236513]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:14:33 compute-1 sudo[236513]: pam_unix(sudo:session): session closed for user root
Dec 06 10:14:34 compute-1 ceph-mon[79770]: pgmap v1004: 337 pgs: 337 active+clean; 167 MiB data, 336 MiB used, 60 GiB / 60 GiB avail; 2.3 MiB/s rd, 3.9 MiB/s wr, 168 op/s
Dec 06 10:14:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:34 : epoch 693401fe : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Dec 06 10:14:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:34 : epoch 693401fe : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Dec 06 10:14:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:34 : epoch 693401fe : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Dec 06 10:14:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:34 : epoch 693401fe : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Dec 06 10:14:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:34 : epoch 693401fe : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Dec 06 10:14:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:34 : epoch 693401fe : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Dec 06 10:14:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:34 : epoch 693401fe : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Dec 06 10:14:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:34 : epoch 693401fe : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 06 10:14:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:34 : epoch 693401fe : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 06 10:14:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:34 : epoch 693401fe : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 06 10:14:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:34 : epoch 693401fe : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Dec 06 10:14:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:34 : epoch 693401fe : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Dec 06 10:14:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:34 : epoch 693401fe : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Dec 06 10:14:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:34 : epoch 693401fe : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Dec 06 10:14:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:34 : epoch 693401fe : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Dec 06 10:14:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:34 : epoch 693401fe : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Dec 06 10:14:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:34 : epoch 693401fe : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Dec 06 10:14:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:34 : epoch 693401fe : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Dec 06 10:14:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:34 : epoch 693401fe : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Dec 06 10:14:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:34 : epoch 693401fe : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Dec 06 10:14:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:34 : epoch 693401fe : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Dec 06 10:14:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:34 : epoch 693401fe : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Dec 06 10:14:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:34 : epoch 693401fe : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Dec 06 10:14:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:34 : epoch 693401fe : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Dec 06 10:14:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:34 : epoch 693401fe : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec 06 10:14:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:34 : epoch 693401fe : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Dec 06 10:14:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:34 : epoch 693401fe : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Dec 06 10:14:34 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:14:34 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:14:34 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:34.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:14:35 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:14:35 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:14:35 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:35.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:14:35 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:35 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e88000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:14:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:36 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e780016e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:14:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:36 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e60000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:14:36 compute-1 ceph-mon[79770]: pgmap v1005: 337 pgs: 337 active+clean; 167 MiB data, 336 MiB used, 60 GiB / 60 GiB avail; 2.2 MiB/s rd, 2.2 MiB/s wr, 140 op/s
Dec 06 10:14:36 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:14:36 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:14:36 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:36.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:14:37 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:14:37 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:14:37 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:37.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:14:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:37 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c000fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:14:38 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/101438 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Dec 06 10:14:38 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:38 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64000d00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:14:38 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:38 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e78002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:14:38 compute-1 ceph-mon[79770]: pgmap v1006: 337 pgs: 337 active+clean; 167 MiB data, 336 MiB used, 60 GiB / 60 GiB avail; 2.2 MiB/s rd, 2.2 MiB/s wr, 140 op/s
Dec 06 10:14:38 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:14:38 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:14:38 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:14:38 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:38.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:14:39 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:14:39 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:14:39 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:39.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:14:39 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:14:39 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:39 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e600016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:14:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:40 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:14:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:40 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:14:40 compute-1 ceph-mon[79770]: pgmap v1007: 337 pgs: 337 active+clean; 167 MiB data, 336 MiB used, 60 GiB / 60 GiB avail; 2.2 MiB/s rd, 2.2 MiB/s wr, 140 op/s
Dec 06 10:14:40 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:14:40 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:14:40 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:40.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:14:41 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:14:41 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:14:41 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:41.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:14:41 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:41 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e78002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:14:42 compute-1 nova_compute[228576]: 2025-12-06 10:14:42.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:14:42 compute-1 nova_compute[228576]: 2025-12-06 10:14:42.471 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:14:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:42 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e78002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:14:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:42 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:14:42 compute-1 ceph-mon[79770]: pgmap v1008: 337 pgs: 337 active+clean; 167 MiB data, 336 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 13 KiB/s wr, 70 op/s
Dec 06 10:14:42 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:14:42 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:14:42 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:42.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:14:43 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:14:43 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:14:43 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:43.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:14:43 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:43 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e60001fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:14:43 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:14:44 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:44 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e78002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:14:44 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:44 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e58000d20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:14:44 compute-1 ceph-mon[79770]: pgmap v1009: 337 pgs: 337 active+clean; 200 MiB data, 365 MiB used, 60 GiB / 60 GiB avail; 2.2 MiB/s rd, 2.1 MiB/s wr, 136 op/s
Dec 06 10:14:44 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:14:44 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:14:44 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:44.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:14:45 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:14:45 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:14:45 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:45.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:14:45 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:45 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:14:46 compute-1 ceph-mon[79770]: pgmap v1010: 337 pgs: 337 active+clean; 200 MiB data, 365 MiB used, 60 GiB / 60 GiB avail; 340 KiB/s rd, 2.1 MiB/s wr, 65 op/s
Dec 06 10:14:46 compute-1 nova_compute[228576]: 2025-12-06 10:14:46.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:14:46 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:46 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e60002140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:14:46 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:46 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e78002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:14:46 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:14:46 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:14:46 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:46.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:14:47 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:14:47 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:14:47 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:47.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:14:47 compute-1 ceph-mon[79770]: from='client.? 192.168.122.10:0/3418864736' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 06 10:14:47 compute-1 ceph-mon[79770]: from='client.? 192.168.122.10:0/3418864736' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 06 10:14:47 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/2548017511' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:14:47 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/4025771904' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:14:47 compute-1 nova_compute[228576]: 2025-12-06 10:14:47.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:14:47 compute-1 nova_compute[228576]: 2025-12-06 10:14:47.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:14:47 compute-1 nova_compute[228576]: 2025-12-06 10:14:47.496 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:14:47 compute-1 nova_compute[228576]: 2025-12-06 10:14:47.496 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:14:47 compute-1 nova_compute[228576]: 2025-12-06 10:14:47.496 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:14:47 compute-1 nova_compute[228576]: 2025-12-06 10:14:47.497 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:14:47 compute-1 nova_compute[228576]: 2025-12-06 10:14:47.497 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:14:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:47 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e58001840 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:14:47 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:14:47 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/97620197' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:14:48 compute-1 nova_compute[228576]: 2025-12-06 10:14:48.010 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.513s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
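[annotation] The subprocess that the resource tracker runs above, `ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf`, can be reproduced standalone. A minimal sketch — the helper name is hypothetical, and the cluster-total keys under a top-level `stats` object are assumed from the usual `ceph df` JSON shape, not taken from nova's code path:

```python
import json
import subprocess

def ceph_df(conf="/etc/ceph/ceph.conf", client_id="openstack"):
    """Run `ceph df --format=json` as the given client and decode the reply."""
    cmd = ["ceph", "df", "--format=json", "--id", client_id, "--conf", conf]
    out = subprocess.run(cmd, capture_output=True, check=True, text=True)
    return json.loads(out.stdout)

if __name__ == "__main__":
    stats = ceph_df()
    # Cluster-wide byte counters are assumed to live under "stats".
    total = stats["stats"]["total_bytes"]
    avail = stats["stats"]["total_avail_bytes"]
    print(f"avail: {avail / 2**30:.1f} GiB of {total / 2**30:.1f} GiB")
```

Each such run shows up twice in this journal: once as the oslo_concurrency "Running cmd"/"returned" pair on the nova side, and once as a `client.openstack` `df` dispatch in the ceph-mon audit lines.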
Dec 06 10:14:48 compute-1 nova_compute[228576]: 2025-12-06 10:14:48.172 228580 WARNING nova.virt.libvirt.driver [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:14:48 compute-1 nova_compute[228576]: 2025-12-06 10:14:48.173 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5189MB free_disk=59.89735412597656GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:14:48 compute-1 nova_compute[228576]: 2025-12-06 10:14:48.174 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:14:48 compute-1 nova_compute[228576]: 2025-12-06 10:14:48.174 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:14:48 compute-1 ceph-mon[79770]: pgmap v1011: 337 pgs: 337 active+clean; 200 MiB data, 365 MiB used, 60 GiB / 60 GiB avail; 340 KiB/s rd, 2.1 MiB/s wr, 65 op/s
Dec 06 10:14:48 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/97620197' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:14:48 compute-1 nova_compute[228576]: 2025-12-06 10:14:48.275 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:14:48 compute-1 nova_compute[228576]: 2025-12-06 10:14:48.275 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:14:48 compute-1 nova_compute[228576]: 2025-12-06 10:14:48.295 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:14:48 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:48 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:14:48 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:14:48 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3271139588' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:14:48 compute-1 nova_compute[228576]: 2025-12-06 10:14:48.722 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.427s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:14:48 compute-1 nova_compute[228576]: 2025-12-06 10:14:48.728 228580 DEBUG nova.compute.provider_tree [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed in ProviderTree for provider: ff2f17cb-ff1d-4da7-9560-4be741380cb1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:14:48 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:48 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e60002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:14:48 compute-1 nova_compute[228576]: 2025-12-06 10:14:48.748 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed for provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:14:48 compute-1 nova_compute[228576]: 2025-12-06 10:14:48.750 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:14:48 compute-1 nova_compute[228576]: 2025-12-06 10:14:48.750 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.576s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
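[annotation] The inventory record logged above fixes how much of this host the scheduler can hand out. A worked check of that arithmetic — assuming placement's usual capacity rule, `(total - reserved) * allocation_ratio`, which is not spelled out in the log itself:

```python
# Values copied from the set_inventory_for_provider line above.
inventory = {
    "VCPU":      {"total": 8,    "reserved": 0,   "allocation_ratio": 4.0},
    "MEMORY_MB": {"total": 7679, "reserved": 512, "allocation_ratio": 1.0},
    "DISK_GB":   {"total": 59,   "reserved": 0,   "allocation_ratio": 0.9},
}

for rc, inv in inventory.items():
    capacity = (inv["total"] - inv["reserved"]) * inv["allocation_ratio"]
    print(f"{rc}: {capacity:g} schedulable")
# VCPU: 32 (8 physical cores overcommitted 4x), MEMORY_MB: 7167
# (512 MB reserved for the host), DISK_GB: 53.1 (10% held back) --
# consistent with the "free_vcpus=8 ... used_ram=512MB" view logged above.
```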
Dec 06 10:14:48 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:14:48 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:14:48 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:14:48 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:48.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:14:49 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:14:49 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:14:49 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:49.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:14:49 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/3271139588' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:14:49 compute-1 nova_compute[228576]: 2025-12-06 10:14:49.744 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:14:49 compute-1 nova_compute[228576]: 2025-12-06 10:14:49.762 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:14:49 compute-1 nova_compute[228576]: 2025-12-06 10:14:49.762 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:14:49 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:49 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e78002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:14:50 compute-1 ceph-mon[79770]: pgmap v1012: 337 pgs: 337 active+clean; 200 MiB data, 365 MiB used, 60 GiB / 60 GiB avail; 340 KiB/s rd, 2.1 MiB/s wr, 65 op/s
Dec 06 10:14:50 compute-1 nova_compute[228576]: 2025-12-06 10:14:50.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:14:50 compute-1 nova_compute[228576]: 2025-12-06 10:14:50.472 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:14:50 compute-1 nova_compute[228576]: 2025-12-06 10:14:50.472 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:14:50 compute-1 nova_compute[228576]: 2025-12-06 10:14:50.492 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 06 10:14:50 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:50 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e58001840 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:14:50 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:50 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c003730 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:14:50 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:14:50 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:14:50 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:50.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:14:51 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:14:51 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:14:51 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:51.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:14:51 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:14:51.231 141446 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '9a:dc:0d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'b6:0a:c4:b8:be:39'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:14:51 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:14:51.234 141446 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 06 10:14:51 compute-1 nova_compute[228576]: 2025-12-06 10:14:51.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:14:51 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:51 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e60002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:14:52 compute-1 ceph-mon[79770]: pgmap v1013: 337 pgs: 337 active+clean; 200 MiB data, 365 MiB used, 60 GiB / 60 GiB avail; 340 KiB/s rd, 2.1 MiB/s wr, 65 op/s
Dec 06 10:14:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:52 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e78002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:14:52 compute-1 sudo[236608]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:14:52 compute-1 sudo[236608]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:14:52 compute-1 sudo[236608]: pam_unix(sudo:session): session closed for user root
Dec 06 10:14:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:52 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e58001840 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:14:52 compute-1 sudo[236645]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Dec 06 10:14:52 compute-1 sudo[236645]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:14:52 compute-1 podman[236625]: 2025-12-06 10:14:52.810304361 +0000 UTC m=+0.112062042 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:14:52 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:14:52 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:14:52 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:52.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:14:53 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:14:53 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:14:53 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:53.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:14:53 compute-1 podman[236756]: 2025-12-06 10:14:53.322919497 +0000 UTC m=+0.068640728 container exec 500f8c89b5c281d0ace139835b07ea5bc6259ce3e028d8284d57da97424b55e0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-crash-compute-1, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.build-date=20250325, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default)
Dec 06 10:14:53 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Dec 06 10:14:53 compute-1 podman[236756]: 2025-12-06 10:14:53.425537355 +0000 UTC m=+0.171258566 container exec_died 500f8c89b5c281d0ace139835b07ea5bc6259ce3e028d8284d57da97424b55e0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-crash-compute-1, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 06 10:14:53 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:53 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c003730 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:14:53 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:14:53 compute-1 podman[236874]: 2025-12-06 10:14:53.961214821 +0000 UTC m=+0.104100675 container exec 6af22af7046e22bedbb2fb280e4d2c530c5b3cac3959f396bf7fe3d14752a7eb (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 06 10:14:53 compute-1 podman[236874]: 2025-12-06 10:14:53.974595322 +0000 UTC m=+0.117481176 container exec_died 6af22af7046e22bedbb2fb280e4d2c530c5b3cac3959f396bf7fe3d14752a7eb (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 06 10:14:54 compute-1 sudo[236894]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:14:54 compute-1 sudo[236894]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:14:54 compute-1 sudo[236894]: pam_unix(sudo:session): session closed for user root
Dec 06 10:14:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:14:54.289 141446 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:14:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:14:54.289 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:14:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:14:54.290 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:14:54 compute-1 podman[236990]: 2025-12-06 10:14:54.306167622 +0000 UTC m=+0.060910068 container exec e1672b0d6c65fac4dad8abe557390306766af98aa8142ef347d33cd29910d02b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_REF=squid, io.buildah.version=1.40.1, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default)
Dec 06 10:14:54 compute-1 podman[236990]: 2025-12-06 10:14:54.314770944 +0000 UTC m=+0.069513370 container exec_died e1672b0d6c65fac4dad8abe557390306766af98aa8142ef347d33cd29910d02b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default)
Dec 06 10:14:54 compute-1 ceph-mon[79770]: pgmap v1014: 337 pgs: 337 active+clean; 200 MiB data, 365 MiB used, 60 GiB / 60 GiB avail; 347 KiB/s rd, 2.1 MiB/s wr, 66 op/s
Dec 06 10:14:54 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:14:54 compute-1 podman[237055]: 2025-12-06 10:14:54.543428698 +0000 UTC m=+0.054923200 container exec 70891cd2190622057f9c45299e27938f7b2105f0244eda3658dedfb18fed50f0 (image=quay.io/ceph/haproxy:2.3, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd)
Dec 06 10:14:54 compute-1 podman[237055]: 2025-12-06 10:14:54.55766896 +0000 UTC m=+0.069163472 container exec_died 70891cd2190622057f9c45299e27938f7b2105f0244eda3658dedfb18fed50f0 (image=quay.io/ceph/haproxy:2.3, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd)
Dec 06 10:14:54 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:54 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c003730 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:14:54 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:54 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e78002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:14:54 compute-1 podman[237122]: 2025-12-06 10:14:54.784852138 +0000 UTC m=+0.052800667 container exec c8ec7212805c01399bc295ce2c5e69b11fbde393e887859b5ab336e81cd6d1f1 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-1-uzbtlt, architecture=x86_64, vendor=Red Hat, Inc., name=keepalived, distribution-scope=public, io.k8s.display-name=Keepalived on RHEL 9, version=2.2.4, vcs-type=git, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.28.2, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides keepalived on RHEL 9 for Ceph., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.openshift.tags=Ceph keepalived, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=keepalived-container, build-date=2023-02-22T09:23:20, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, description=keepalived for Ceph, release=1793)
Dec 06 10:14:54 compute-1 podman[237122]: 2025-12-06 10:14:54.803433277 +0000 UTC m=+0.071381806 container exec_died c8ec7212805c01399bc295ce2c5e69b11fbde393e887859b5ab336e81cd6d1f1 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-1-uzbtlt, summary=Provides keepalived on RHEL 9 for Ceph., version=2.2.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=keepalived, com.redhat.component=keepalived-container, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=Ceph keepalived, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.28.2, build-date=2023-02-22T09:23:20, release=1793, io.k8s.display-name=Keepalived on RHEL 9, architecture=x86_64, vendor=Red Hat, Inc., description=keepalived for Ceph)
Dec 06 10:14:54 compute-1 sudo[236645]: pam_unix(sudo:session): session closed for user root
Dec 06 10:14:54 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:14:54 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:14:54 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:54.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:14:54 compute-1 sudo[237155]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:14:54 compute-1 sudo[237155]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:14:54 compute-1 sudo[237155]: pam_unix(sudo:session): session closed for user root
Dec 06 10:14:55 compute-1 sudo[237180]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 06 10:14:55 compute-1 sudo[237180]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:14:55 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:14:55 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:14:55 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:55.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:14:55 compute-1 nova_compute[228576]: 2025-12-06 10:14:55.465 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:14:55 compute-1 sudo[237180]: pam_unix(sudo:session): session closed for user root
Dec 06 10:14:55 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:55 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e58002cd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:14:55 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:14:55 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:14:55 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/3783245906' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:14:55 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:14:55 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:14:55 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Dec 06 10:14:56 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:14:56.236 141446 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=61eba479-a995-4b31-88b9-8ebfcea9907e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 10:14:56 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:56 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c003730 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:14:56 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:56 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e60003fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:14:56 compute-1 podman[237238]: 2025-12-06 10:14:56.753356195 +0000 UTC m=+0.056824606 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Dec 06 10:14:56 compute-1 ceph-mon[79770]: pgmap v1015: 337 pgs: 337 active+clean; 200 MiB data, 365 MiB used, 60 GiB / 60 GiB avail; 6.9 KiB/s rd, 15 KiB/s wr, 1 op/s
Dec 06 10:14:56 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Dec 06 10:14:56 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 10:14:56 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 06 10:14:56 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:14:56 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:14:56 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 06 10:14:56 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 06 10:14:56 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 10:14:56 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/844916991' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:14:56 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:14:56 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:14:56 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:56.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:14:57 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:14:57 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:14:57 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:57.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:14:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:57 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e60003fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:14:57 compute-1 ceph-mon[79770]: pgmap v1016: 337 pgs: 337 active+clean; 200 MiB data, 365 MiB used, 60 GiB / 60 GiB avail; 7.7 KiB/s rd, 17 KiB/s wr, 1 op/s
Dec 06 10:14:58 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:58 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e58002cd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:14:58 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:58 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:14:58 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:14:58 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:14:58 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:14:58 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:14:58.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:14:58 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/2727823202' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:14:59 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:14:59 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:14:59 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:14:59.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:14:59 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:14:59 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e78002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:14:59 compute-1 ceph-mon[79770]: pgmap v1017: 337 pgs: 337 active+clean; 121 MiB data, 324 MiB used, 60 GiB / 60 GiB avail; 29 KiB/s rd, 23 KiB/s wr, 33 op/s
Dec 06 10:15:00 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:00 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e60003fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:00 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:00 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e58002cd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:00 compute-1 podman[237259]: 2025-12-06 10:15:00.75508945 +0000 UTC m=+0.063494751 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd)
Dec 06 10:15:00 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:15:00 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:15:00 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:00.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:15:01 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:15:01 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:15:01 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:01.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:15:01 compute-1 sudo[237279]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:15:01 compute-1 sudo[237279]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:15:01 compute-1 sudo[237279]: pam_unix(sudo:session): session closed for user root
Dec 06 10:15:01 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:01 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:01 compute-1 ceph-mon[79770]: pgmap v1018: 337 pgs: 337 active+clean; 121 MiB data, 324 MiB used, 60 GiB / 60 GiB avail; 29 KiB/s rd, 9.1 KiB/s wr, 32 op/s
Dec 06 10:15:01 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:15:01 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:15:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:02 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e78002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:02 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e60003fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:02 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:15:02 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:15:02 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:02.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:15:03 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:15:03 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:15:03 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:03.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:15:03 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:03 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e58003dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:03 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:15:04 compute-1 ceph-mon[79770]: pgmap v1019: 337 pgs: 337 active+clean; 121 MiB data, 324 MiB used, 60 GiB / 60 GiB avail; 29 KiB/s rd, 9.1 KiB/s wr, 32 op/s
Dec 06 10:15:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:04 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:04 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e78002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:04 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:15:04 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:15:04 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:04.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:15:05 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:15:05 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:15:05 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:05.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:15:05 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:05 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e60003fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:06 compute-1 ceph-mon[79770]: pgmap v1020: 337 pgs: 337 active+clean; 121 MiB data, 322 MiB used, 60 GiB / 60 GiB avail; 21 KiB/s rd, 7.2 KiB/s wr, 32 op/s
Dec 06 10:15:06 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/3815512636' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:15:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:06 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e58003dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:06 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:06 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:15:06 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:15:06 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:06.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:15:07 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:15:07 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:15:07 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:07.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:15:07 compute-1 ceph-mon[79770]: pgmap v1021: 337 pgs: 337 active+clean; 121 MiB data, 322 MiB used, 60 GiB / 60 GiB avail; 21 KiB/s rd, 7.2 KiB/s wr, 32 op/s
Dec 06 10:15:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:07 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e78002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:08 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:08 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e60003fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:08 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:08 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e58003dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:08 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:15:08 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:15:08 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:15:08 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:08.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:15:09 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:15:09 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:15:09 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:09.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:15:09 compute-1 ceph-mon[79770]: pgmap v1022: 337 pgs: 337 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 38 KiB/s rd, 7.7 KiB/s wr, 56 op/s
Dec 06 10:15:09 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:15:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:09 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:10 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:10 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e78002000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:10 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:10 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e60003fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:10 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:15:10 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:15:10 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:10.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:15:11 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:15:11 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:15:11 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:11.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:15:11 compute-1 ceph-mon[79770]: pgmap v1023: 337 pgs: 337 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 2.2 KiB/s wr, 28 op/s
Dec 06 10:15:11 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:11 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e58003dd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:12 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:12 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e840013a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:12 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:15:12 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:15:12 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:12.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:15:13 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:15:13 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:15:13 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:13.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:15:13 compute-1 ceph-mon[79770]: pgmap v1024: 337 pgs: 337 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 2.2 KiB/s wr, 28 op/s
Dec 06 10:15:13 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:13 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e60003fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:13 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:15:14 compute-1 sudo[237312]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:15:14 compute-1 sudo[237312]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:15:14 compute-1 sudo[237312]: pam_unix(sudo:session): session closed for user root
Dec 06 10:15:14 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:14 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64001140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:14 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:14 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:14 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:15:14 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:15:14 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:14.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:15:15 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:15:15 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:15:15 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:15.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:15:15 compute-1 ceph-mon[79770]: pgmap v1025: 337 pgs: 337 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 2.2 KiB/s wr, 28 op/s
Dec 06 10:15:15 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:15 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:16 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:16 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e60003fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:16 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:16 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64001c40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:16 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:15:16 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:15:16 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:16.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:15:17 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:15:17 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:15:17 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:17.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:15:17 compute-1 ceph-mon[79770]: pgmap v1026: 337 pgs: 337 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 27 op/s
Dec 06 10:15:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:17 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e84001eb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:18 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e60003fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:18 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:18 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:15:18 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:15:18 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:15:18 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:18.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:15:19 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:15:19 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:15:19 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:19.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:15:19 compute-1 ceph-mon[79770]: pgmap v1027: 337 pgs: 337 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Dec 06 10:15:19 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:19 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:20 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:20 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e840027d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:20 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:20 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e60003fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:20 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:15:20 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:15:20 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:20.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:15:21 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:15:21 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:15:21 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:21.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:15:21 compute-1 ceph-mon[79770]: pgmap v1028: 337 pgs: 337 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:15:21 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:21 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:22 compute-1 sshd-session[237341]: banner exchange: Connection from 3.131.215.38 port 40898: invalid format
Dec 06 10:15:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:22 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:22 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:22 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:15:22 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:15:22 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:22.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:15:23 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:15:23 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:15:23 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:23.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:15:23 compute-1 podman[237344]: 2025-12-06 10:15:23.78825523 +0000 UTC m=+0.088154241 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 06 10:15:23 compute-1 ceph-mon[79770]: pgmap v1029: 337 pgs: 337 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:15:23 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:23 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e54000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:23 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:15:24 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:24 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e840027d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:24 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:24 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64002560 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:24 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:15:24 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:15:25 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:15:25 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:24.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:15:25 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:15:25 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:15:25 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:25.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:15:25 compute-1 ceph-mon[79770]: pgmap v1030: 337 pgs: 337 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Dec 06 10:15:25 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:25 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64002560 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:26 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:26 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e540016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:26 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:26 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e840027d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:27 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:15:27 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:15:27 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:27.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:15:27 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:15:27 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:15:27 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:27.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:15:27 compute-1 podman[237373]: 2025-12-06 10:15:27.758384674 +0000 UTC m=+0.054769225 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Dec 06 10:15:27 compute-1 ceph-mon[79770]: pgmap v1031: 337 pgs: 337 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:15:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:27 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64002560 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:28 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:28 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:28 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:28 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e540016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:28 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:15:29 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:15:29 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:15:29 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:29.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:15:29 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:15:29 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:15:29 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:29.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:15:29 compute-1 ceph-mon[79770]: pgmap v1032: 337 pgs: 337 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Dec 06 10:15:29 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/3798072465' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:15:29 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:29 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e84003b20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:30 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:30 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64003660 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:30 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:30 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:31 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:15:31 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:15:31 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:31.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:15:31 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:15:31 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:15:31 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:31.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:15:31 compute-1 podman[237394]: 2025-12-06 10:15:31.747399556 +0000 UTC m=+0.054033678 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, managed_by=edpm_ansible)
Dec 06 10:15:31 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:31 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e540016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:31 compute-1 ceph-mon[79770]: pgmap v1033: 337 pgs: 337 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:15:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:32 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e84003b20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:32 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64003660 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:33 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:15:33 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:15:33 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:33.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:15:33 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:15:33 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:15:33 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:33.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:15:33 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:33 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:33 compute-1 ceph-mon[79770]: pgmap v1034: 337 pgs: 337 active+clean; 41 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:15:33 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/125323094' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec 06 10:15:33 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:15:34 compute-1 sudo[237416]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:15:34 compute-1 sudo[237416]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:15:34 compute-1 sudo[237416]: pam_unix(sudo:session): session closed for user root
Dec 06 10:15:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:34 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e54002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:34 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e84003b20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:34 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/174551433' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Dec 06 10:15:35 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:15:35 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:15:35 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:35.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:15:35 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:15:35 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:15:35 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:35.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:15:35 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:35 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64003660 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:35 compute-1 ceph-mon[79770]: pgmap v1035: 337 pgs: 337 active+clean; 88 MiB data, 293 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Dec 06 10:15:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:36 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:36 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e54002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:37 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:15:37 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:15:37 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:37.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:15:37 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:15:37 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:15:37 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:37.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:15:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:37 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e84004830 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:37 compute-1 ceph-mon[79770]: pgmap v1036: 337 pgs: 337 active+clean; 88 MiB data, 293 MiB used, 60 GiB / 60 GiB avail; 17 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Dec 06 10:15:38 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:38 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64003660 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:38 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:38 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:38 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:15:39 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:15:39 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:15:39 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:39.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:15:39 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:15:39 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:15:39 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:39.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:15:39 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:39 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e54002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:39 compute-1 ceph-mon[79770]: pgmap v1037: 337 pgs: 337 active+clean; 88 MiB data, 293 MiB used, 60 GiB / 60 GiB avail; 638 KiB/s rd, 1.8 MiB/s wr, 58 op/s
Dec 06 10:15:39 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:15:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:40 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e84004830 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:40 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64003660 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:41 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:15:41 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:15:41 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:41.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:15:41 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:15:41 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:15:41 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:41.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:15:41 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:41 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c004070 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:41 compute-1 ceph-mon[79770]: pgmap v1038: 337 pgs: 337 active+clean; 88 MiB data, 293 MiB used, 60 GiB / 60 GiB avail; 638 KiB/s rd, 1.8 MiB/s wr, 58 op/s
Dec 06 10:15:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:42 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e54003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:42 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e84004830 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:43 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:15:43 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:15:43 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:43.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:15:43 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:15:43 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:15:43 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:43.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:15:43 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:43 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64003660 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:43 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:15:43 compute-1 ceph-mon[79770]: pgmap v1039: 337 pgs: 337 active+clean; 88 MiB data, 293 MiB used, 60 GiB / 60 GiB avail; 638 KiB/s rd, 1.8 MiB/s wr, 58 op/s
Dec 06 10:15:44 compute-1 nova_compute[228576]: 2025-12-06 10:15:44.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:15:44 compute-1 nova_compute[228576]: 2025-12-06 10:15:44.471 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:15:44 compute-1 nova_compute[228576]: 2025-12-06 10:15:44.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:15:44 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:44 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c004070 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:44 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:44 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e54003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:45 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:15:45 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:15:45 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:45.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:15:45 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:15:45 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:15:45 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:45.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:15:45 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:45 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e84004830 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:46 compute-1 nova_compute[228576]: 2025-12-06 10:15:46.480 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:15:46 compute-1 nova_compute[228576]: 2025-12-06 10:15:46.481 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:15:46 compute-1 nova_compute[228576]: 2025-12-06 10:15:46.481 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Dec 06 10:15:46 compute-1 nova_compute[228576]: 2025-12-06 10:15:46.509 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Dec 06 10:15:46 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:46 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64003660 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:46 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:46 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c004090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:46 compute-1 ceph-mon[79770]: pgmap v1040: 337 pgs: 337 active+clean; 88 MiB data, 293 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 101 op/s
Dec 06 10:15:47 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:15:47 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:15:47 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:47.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:15:47 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:15:47 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:15:47 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:47.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:15:47 compute-1 nova_compute[228576]: 2025-12-06 10:15:47.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:15:47 compute-1 nova_compute[228576]: 2025-12-06 10:15:47.471 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Dec 06 10:15:47 compute-1 ceph-mon[79770]: from='client.? 192.168.122.10:0/3121417538' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 06 10:15:47 compute-1 ceph-mon[79770]: from='client.? 192.168.122.10:0/3121417538' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 06 10:15:47 compute-1 ceph-mon[79770]: pgmap v1041: 337 pgs: 337 active+clean; 88 MiB data, 293 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 73 op/s
Dec 06 10:15:47 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/3803708120' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:15:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:47 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e54003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:48 compute-1 nova_compute[228576]: 2025-12-06 10:15:48.494 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:15:48 compute-1 nova_compute[228576]: 2025-12-06 10:15:48.494 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:15:48 compute-1 nova_compute[228576]: 2025-12-06 10:15:48.539 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:15:48 compute-1 nova_compute[228576]: 2025-12-06 10:15:48.540 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:15:48 compute-1 nova_compute[228576]: 2025-12-06 10:15:48.540 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:15:48 compute-1 nova_compute[228576]: 2025-12-06 10:15:48.540 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:15:48 compute-1 nova_compute[228576]: 2025-12-06 10:15:48.541 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:15:48 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:48 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e84004830 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:48 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:48 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64003660 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:48 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/3505482668' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:15:48 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:15:48 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:15:48 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2620383535' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:15:49 compute-1 nova_compute[228576]: 2025-12-06 10:15:49.013 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.472s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:15:49 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:15:49 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:15:49 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:49.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:15:49 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:15:49 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:15:49 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:49.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:15:49 compute-1 nova_compute[228576]: 2025-12-06 10:15:49.213 228580 WARNING nova.virt.libvirt.driver [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:15:49 compute-1 nova_compute[228576]: 2025-12-06 10:15:49.214 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5193MB free_disk=59.96738052368164GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:15:49 compute-1 nova_compute[228576]: 2025-12-06 10:15:49.215 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:15:49 compute-1 nova_compute[228576]: 2025-12-06 10:15:49.215 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
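The acquire/release pair around "compute_resources" is oslo.concurrency's named-lock pattern: every resource-tracker mutation runs under the same lock name, so concurrent periodic tasks serialize (the release is logged below with held 1.292s). The usual form, reduced to a sketch:

    from oslo_concurrency import lockutils

    @lockutils.synchronized("compute_resources")
    def _update_available_resource():
        ...  # inventory refresh runs while the named lock is held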
Dec 06 10:15:49 compute-1 nova_compute[228576]: 2025-12-06 10:15:49.777 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:15:49 compute-1 nova_compute[228576]: 2025-12-06 10:15:49.778 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:15:49 compute-1 nova_compute[228576]: 2025-12-06 10:15:49.837 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Refreshing inventories for resource provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec 06 10:15:49 compute-1 ceph-mon[79770]: pgmap v1042: 337 pgs: 337 active+clean; 88 MiB data, 293 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 74 op/s
Dec 06 10:15:49 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/2620383535' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:15:49 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:49 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c0040b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
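These recurring ganesha EVENT lines (and the identical ones throughout this section) are TIRPC rejecting connections whose PROXY-protocol preamble never fully arrives; plain TCP health checks that connect and immediately close are a likely trigger, and the stray '%' is a garbled format field in ganesha's own message, not data from this host. For contrast, a well-formed PROXY v1 preamble looks like this (addresses and the NFS port are placeholders):

    import socket

    # A checker that closes before sending this preamble trips svc_vc_recv.
    preamble = b"PROXY TCP4 192.168.122.100 192.168.122.101 51000 2049\r\n"
    s = socket.create_connection(("192.168.122.101", 2049), timeout=2)
    s.sendall(preamble)
    s.close()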
Dec 06 10:15:49 compute-1 nova_compute[228576]: 2025-12-06 10:15:49.905 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Updating ProviderTree inventory for provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec 06 10:15:49 compute-1 nova_compute[228576]: 2025-12-06 10:15:49.906 228580 DEBUG nova.compute.provider_tree [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Updating inventory in ProviderTree for provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
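Placement derives schedulable capacity from these dicts as (total - reserved) * allocation_ratio, so the inventory above advertises considerably more than the physical view logged earlier:

    vcpu = (8 - 0) * 4.0        # 32 schedulable VCPUs from 8 physical
    ram  = (7679 - 512) * 1.0   # 7167 MB of placeable RAM
    disk = (59 - 0) * 0.9       # 53.1 -> 53 GB after the 0.9 disk ratio
    print(vcpu, ram, int(disk))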
Dec 06 10:15:49 compute-1 nova_compute[228576]: 2025-12-06 10:15:49.932 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Refreshing aggregate associations for resource provider ff2f17cb-ff1d-4da7-9560-4be741380cb1, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec 06 10:15:49 compute-1 nova_compute[228576]: 2025-12-06 10:15:49.952 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Refreshing trait associations for resource provider ff2f17cb-ff1d-4da7-9560-4be741380cb1, traits: COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_ACCELERATORS,HW_CPU_X86_SVM,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_AESNI,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSSE3,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE4A,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_FMA3,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_AVX,HW_CPU_X86_BMI2,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SHA,HW_CPU_X86_SSE2,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_F16C,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_ABM,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NODE,COMPUTE_SECURITY_TPM_2_0,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_TPM_1_2,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_CLMUL,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AVX2,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_BMI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SOCKET_PCI_NUMA_AFFINITY _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
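The trait dump mixes two namespaces: HW_CPU_* flags detected from the guest CPU and COMPUTE_* capabilities advertised by the driver; flavors and images match on these verbatim strings. A quick split by prefix, using a hand-copied sample of the full set above:

    sample = [
        "COMPUTE_VIOMMU_MODEL_INTEL", "HW_CPU_X86_SSE42",
        "COMPUTE_GRAPHICS_MODEL_VGA", "HW_CPU_X86_AVX2",
        "COMPUTE_IMAGE_TYPE_QCOW2", "COMPUTE_NODE",
    ]
    print([t for t in sample if t.startswith("HW_CPU_")])
    print([t for t in sample if t.startswith("COMPUTE_")])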
Dec 06 10:15:49 compute-1 nova_compute[228576]: 2025-12-06 10:15:49.968 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:15:50 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:15:50 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4091665223' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:15:50 compute-1 nova_compute[228576]: 2025-12-06 10:15:50.476 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.509s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:15:50 compute-1 nova_compute[228576]: 2025-12-06 10:15:50.483 228580 DEBUG nova.compute.provider_tree [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed in ProviderTree for provider: ff2f17cb-ff1d-4da7-9560-4be741380cb1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:15:50 compute-1 nova_compute[228576]: 2025-12-06 10:15:50.503 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed for provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:15:50 compute-1 nova_compute[228576]: 2025-12-06 10:15:50.506 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:15:50 compute-1 nova_compute[228576]: 2025-12-06 10:15:50.507 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.292s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:15:50 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:50 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e54003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:50 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:50 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e84004830 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:50 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/4091665223' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:15:51 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:15:51 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:15:51 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:51.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:15:51 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:15:51 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:15:51 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:51.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:15:51 compute-1 nova_compute[228576]: 2025-12-06 10:15:51.485 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:15:51 compute-1 nova_compute[228576]: 2025-12-06 10:15:51.486 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:15:51 compute-1 nova_compute[228576]: 2025-12-06 10:15:51.486 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
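_poll_rebooting_instances, _poll_rescued_instances and _instance_usage_audit are all registered through oslo.service's periodic-task machinery, which is what run_periodic_tasks is iterating here. A minimal sketch of how such a task is declared (the spacing value is an arbitrary example, not nova's default):

    from oslo_service import periodic_task

    class Manager(periodic_task.PeriodicTasks):
        @periodic_task.periodic_task(spacing=60)
        def _poll_rebooting_instances(self, context):
            pass  # the runner invokes this on its own interval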
Dec 06 10:15:51 compute-1 ceph-mon[79770]: pgmap v1043: 337 pgs: 337 active+clean; 88 MiB data, 293 MiB used, 60 GiB / 60 GiB avail; 1.3 MiB/s rd, 43 op/s
Dec 06 10:15:51 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:51 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64003660 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:52 compute-1 nova_compute[228576]: 2025-12-06 10:15:52.472 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:15:52 compute-1 nova_compute[228576]: 2025-12-06 10:15:52.472 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:15:52 compute-1 nova_compute[228576]: 2025-12-06 10:15:52.472 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:15:52 compute-1 nova_compute[228576]: 2025-12-06 10:15:52.488 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 06 10:15:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:52 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c0040d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:52 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e54003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:53 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:15:53 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:15:53 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:53.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:15:53 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:15:53 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:15:53 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:53.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:15:53 compute-1 ceph-mon[79770]: pgmap v1044: 337 pgs: 337 active+clean; 88 MiB data, 293 MiB used, 60 GiB / 60 GiB avail; 1.3 MiB/s rd, 43 op/s
Dec 06 10:15:53 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:53 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e84004830 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:53 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
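_set_new_cache_sizes is the monitor's memory autotuner splitting its roughly 1 GB target between the incremental-osdmap, full-osdmap and rocksdb caches; the logged figures convert to MiB as follows:

    total = 1020054731 / 2**20  # ~972.8 MiB overall cache target
    inc   = 343932928 / 2**20   # 328 MiB incremental osdmap cache
    full  = 348127232 / 2**20   # 332 MiB full osdmap cache
    kv    = 318767104 / 2**20   # 304 MiB rocksdb (kv) cache
    print(round(total, 1), inc + full + kv)  # the three slices sum to 964 MiB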
Dec 06 10:15:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:15:54.290 141446 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:15:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:15:54.291 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:15:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:15:54.291 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:15:54 compute-1 sudo[237497]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:15:54 compute-1 sudo[237497]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:15:54 compute-1 sudo[237497]: pam_unix(sudo:session): session closed for user root
Dec 06 10:15:54 compute-1 podman[237521]: 2025-12-06 10:15:54.39198066 +0000 UTC m=+0.081570058 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
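health_status=healthy events like this one are podman running the check mounted at /openstack/healthcheck inside the container on its timer. The same check can be invoked on demand, where exit status 0 means healthy (container name taken from the log):

    import subprocess

    subprocess.run(["podman", "healthcheck", "run", "ovn_controller"], check=True)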
Dec 06 10:15:54 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:54 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64003660 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:54 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:54 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e58002830 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:54 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:15:55 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:15:55 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:15:55 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:55.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:15:55 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:15:55 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:15:55 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:55.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:15:55 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:55 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e54003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:55 compute-1 ceph-mon[79770]: pgmap v1045: 337 pgs: 337 active+clean; 121 MiB data, 336 MiB used, 60 GiB / 60 GiB avail; 1.6 MiB/s rd, 2.1 MiB/s wr, 107 op/s
Dec 06 10:15:55 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/3624299820' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:15:56 compute-1 nova_compute[228576]: 2025-12-06 10:15:56.480 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:15:56 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:56 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e60001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:56 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:56 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64003660 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:56 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/1552888335' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:15:57 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:15:57 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:15:57 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:57.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:15:57 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:15:57 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:15:57 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:57.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:15:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:57 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e58002830 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:57 compute-1 ceph-mon[79770]: pgmap v1046: 337 pgs: 337 active+clean; 121 MiB data, 336 MiB used, 60 GiB / 60 GiB avail; 325 KiB/s rd, 2.1 MiB/s wr, 63 op/s
Dec 06 10:15:58 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:58 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e54003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:58 compute-1 podman[237552]: 2025-12-06 10:15:58.778993562 +0000 UTC m=+0.073100978 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:15:58 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:58 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e60001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:15:58 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:15:59 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:15:59 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:15:59 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:15:59.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:15:59 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:15:59 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:15:59 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:15:59.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:15:59 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:15:59 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64003660 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:00 compute-1 ceph-mon[79770]: pgmap v1047: 337 pgs: 337 active+clean; 121 MiB data, 336 MiB used, 60 GiB / 60 GiB avail; 326 KiB/s rd, 2.1 MiB/s wr, 64 op/s
Dec 06 10:16:00 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:00 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e58002830 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:00 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:00 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e54003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:01 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:16:01 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:16:01 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:01.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:16:01 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:16:01 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:16:01 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:01.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:16:01 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:16:01.401 141446 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '9a:dc:0d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'b6:0a:c4:b8:be:39'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:16:01 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:16:01.402 141446 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
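The agent batches SB_Global nb_cfg bumps instead of writing its chassis row on every update; the matching DbSetCommand that records 'neutron:ovn-metadata-sb-cfg': '13' appears ten seconds later in this log, at 10:16:11.403. The pattern, reduced to a sketch with a print standing in for the OVSDB write:

    import threading

    def write_chassis(nb_cfg):
        print(f"neutron:ovn-metadata-sb-cfg = {nb_cfg}")

    threading.Timer(10, write_chassis, args=(13,)).start()  # fires once, 10 s later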
Dec 06 10:16:01 compute-1 sudo[237573]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:16:01 compute-1 sudo[237573]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:16:01 compute-1 sudo[237573]: pam_unix(sudo:session): session closed for user root
Dec 06 10:16:01 compute-1 sudo[237598]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 06 10:16:01 compute-1 sudo[237598]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:16:01 compute-1 podman[237622]: 2025-12-06 10:16:01.883584784 +0000 UTC m=+0.062250560 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 06 10:16:01 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:01 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e60002690 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:02 compute-1 ceph-mon[79770]: pgmap v1048: 337 pgs: 337 active+clean; 121 MiB data, 336 MiB used, 60 GiB / 60 GiB avail; 323 KiB/s rd, 2.1 MiB/s wr, 63 op/s
Dec 06 10:16:02 compute-1 sudo[237598]: pam_unix(sudo:session): session closed for user root
Dec 06 10:16:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:02 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64003660 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:02 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e58002830 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:03 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:16:03 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:16:03 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:03.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:16:03 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:16:03 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:16:03 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:03.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:16:03 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:03 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e54003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:03 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:16:04 compute-1 ceph-mon[79770]: pgmap v1049: 337 pgs: 337 active+clean; 121 MiB data, 336 MiB used, 60 GiB / 60 GiB avail; 323 KiB/s rd, 2.1 MiB/s wr, 63 op/s
Dec 06 10:16:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:04 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e54003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:04 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64003660 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:05 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:16:05 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:16:05 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:05.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:16:05 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:16:05 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:16:05 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:05.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:16:05 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:05 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e58002830 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:06 compute-1 ceph-mon[79770]: pgmap v1050: 337 pgs: 337 active+clean; 121 MiB data, 336 MiB used, 60 GiB / 60 GiB avail; 323 KiB/s rd, 2.1 MiB/s wr, 64 op/s
Dec 06 10:16:06 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:16:06 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:16:06 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 10:16:06 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 06 10:16:06 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:16:06 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:16:06 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 06 10:16:06 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 06 10:16:06 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 10:16:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:06 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e54003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:06 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:06 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e54003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:07 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:16:07 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:16:07 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:07.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:16:07 compute-1 ceph-mon[79770]: pgmap v1051: 337 pgs: 337 active+clean; 121 MiB data, 336 MiB used, 60 GiB / 60 GiB avail; 955 B/s rd, 15 KiB/s wr, 1 op/s
Dec 06 10:16:07 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/2500206167' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:16:07 compute-1 ceph-mon[79770]: Health check cleared: CEPHADM_FAILED_DAEMON (was: 1 failed cephadm daemon(s))
Dec 06 10:16:07 compute-1 ceph-mon[79770]: Cluster is now healthy
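With CEPHADM_FAILED_DAEMON cleared, the cluster reports HEALTH_OK; the transition can be confirmed with the same client credentials used elsewhere in this log:

    import json
    import subprocess

    out = subprocess.run(
        ["ceph", "health", "--format=json", "--id", "openstack",
         "--conf", "/etc/ceph/ceph.conf"],
        capture_output=True, text=True, check=True,
    ).stdout
    print(json.loads(out)["status"])  # "HEALTH_OK" after the check clears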
Dec 06 10:16:07 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:16:07 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:16:07 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:07.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:16:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:07 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64003660 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:08 compute-1 ceph-mon[79770]: pgmap v1052: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 16 KiB/s wr, 29 op/s
Dec 06 10:16:08 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:08 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e58003650 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:08 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:08 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e54003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:08 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:16:09 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:16:09 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:16:09 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:09.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:16:09 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:16:09 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:16:09 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:09.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:16:09 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:09 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e60003240 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:09 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:16:09 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:16:10 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:10 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64003660 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:10 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:10 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e58003650 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:11 compute-1 ceph-mon[79770]: pgmap v1053: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 3.2 KiB/s wr, 28 op/s
Dec 06 10:16:11 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:16:11 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:16:11 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:16:11 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:16:11 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:11.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:16:11 compute-1 sudo[237681]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:16:11 compute-1 sudo[237681]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:16:11 compute-1 sudo[237681]: pam_unix(sudo:session): session closed for user root
Dec 06 10:16:11 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:16:11 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:16:11 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:11.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:16:11 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:16:11.403 141446 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=61eba479-a995-4b31-88b9-8ebfcea9907e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 10:16:11 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:11 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e54003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:12 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e600033e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:12 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64003660 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:13 compute-1 ceph-mon[79770]: pgmap v1054: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 3.2 KiB/s wr, 28 op/s
Dec 06 10:16:13 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:16:13 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:16:13 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:13.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:16:13 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:16:13 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:16:13 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:13.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:16:13 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:13 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e58003650 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:13 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:16:14 compute-1 sudo[237708]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:16:14 compute-1 sudo[237708]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:16:14 compute-1 sudo[237708]: pam_unix(sudo:session): session closed for user root
Dec 06 10:16:14 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:14 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e54003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:14 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:14 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e60003e70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:15 compute-1 ceph-mon[79770]: pgmap v1055: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 3.2 KiB/s wr, 28 op/s
Dec 06 10:16:15 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:16:15 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:16:15 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:15.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:16:15 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:16:15 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:16:15 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:15.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:16:15 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:15 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64003660 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:16 compute-1 ceph-mon[79770]: pgmap v1056: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Dec 06 10:16:16 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:16 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e58003650 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:16 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:16 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e54003c30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:17 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:16:17 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:16:17 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:17.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:16:17 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:16:17 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:16:17 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:17.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:16:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:17 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e60003e70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/101617 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
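haproxy marks nfs.cephfs.1 DOWN on a Layer4 failure: the TCP connect itself was refused, one step earlier than the incomplete-preamble events above. The same reachability test by hand (backend address and port are placeholders):

    import socket

    try:
        socket.create_connection(("192.168.122.101", 2049), timeout=2).close()
        print("backend accepts connections")
    except ConnectionRefusedError:
        print("backend down: connection refused")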
Dec 06 10:16:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:18 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64003660 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:18 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:18 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e58003650 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:18 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:16:18 compute-1 ceph-mon[79770]: pgmap v1057: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Dec 06 10:16:19 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:16:19 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:16:19 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:19.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:16:19 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:16:19 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:16:19 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:19.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:16:19 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:19 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e54003c50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:20 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:20 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e60003e70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:20 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:20 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64003660 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:20 compute-1 ceph-mon[79770]: pgmap v1058: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:16:21 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:16:21 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:16:21 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:21.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:16:21 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:16:21 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:16:21 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:21.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:16:21 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:21 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e58003650 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:22 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e58003650 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:22 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e60003e70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:22 compute-1 ceph-mon[79770]: pgmap v1059: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:16:23 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:16:23 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:16:23 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:23.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:16:23 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:16:23 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:16:23 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:23.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:16:23 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:23 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64004370 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:23 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:16:24 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:16:24 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:24 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e58003650 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:24 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:24 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c0014d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:24 compute-1 podman[237741]: 2025-12-06 10:16:24.825702687 +0000 UTC m=+0.129256627 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:16:25 compute-1 ceph-mon[79770]: pgmap v1060: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Dec 06 10:16:25 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:16:25 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:16:25 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:25.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:16:25 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:16:25 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:16:25 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:25.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:16:25 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:25 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e60003e70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:26 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:26 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64004370 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:26 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:26 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e84001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:27 compute-1 ceph-mon[79770]: pgmap v1061: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 06 10:16:27 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:16:27 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:16:27 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:27.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:16:27 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:16:27 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:16:27 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:27.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:16:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:27 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c0014d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:28 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:28 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e60003e70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:28 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:28 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64004370 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:28 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:16:29 compute-1 ceph-mon[79770]: pgmap v1062: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:16:29 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:16:29 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:16:29 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:29.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:16:29 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:16:29 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:16:29 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:29.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:16:29 compute-1 sshd-session[237735]: Connection closed by 3.131.215.38 port 41882 [preauth]
Dec 06 10:16:29 compute-1 podman[237770]: 2025-12-06 10:16:29.78004495 +0000 UTC m=+0.077125738 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 06 10:16:29 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:29 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64004370 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:30 compute-1 ceph-mon[79770]: pgmap v1063: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 06 10:16:30 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:30 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e84001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:30 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:30 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e60003e70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:31 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:16:31 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:16:31 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:31.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:16:31 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:16:31 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:16:31 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:31.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:16:31 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:31 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c0014d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:32 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64004370 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:32 compute-1 podman[237791]: 2025-12-06 10:16:32.74597706 +0000 UTC m=+0.053804691 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:16:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:32 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e840030a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:32 compute-1 ceph-mon[79770]: pgmap v1064: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 06 10:16:33 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:16:33 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:16:33 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:33.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:16:33 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:16:33 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:16:33 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:33.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:16:33 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:33 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e60003e70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:33 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:16:34 compute-1 sudo[237812]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:16:34 compute-1 sudo[237812]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:16:34 compute-1 sudo[237812]: pam_unix(sudo:session): session closed for user root
Dec 06 10:16:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:34 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c001670 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:34 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:34 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64004370 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:34 compute-1 ceph-mon[79770]: pgmap v1065: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:16:35 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:16:35 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:16:35 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:35.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:16:35 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:16:35 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:16:35 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:35.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:16:35 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:35 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64004370 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:36 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64004370 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:36 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:36 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c001670 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:37 compute-1 ceph-mon[79770]: pgmap v1066: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 06 10:16:37 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:16:37 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:16:37 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:37.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:16:37 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:16:37 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:16:37 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:37.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:16:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:37 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e60003e70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:38 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:38 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e840030a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:38 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:38 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64004370 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:38 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:16:39 compute-1 ceph-mon[79770]: pgmap v1067: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Dec 06 10:16:39 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:16:39 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:16:39 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:16:39 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:39.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:16:39 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:16:39 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:16:39 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:39.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:16:39 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:39 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c001670 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:40 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e60003e70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/101640 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec 06 10:16:40 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:40 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e840030a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:41 compute-1 ceph-mon[79770]: pgmap v1068: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 06 10:16:41 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:16:41 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:16:41 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:41.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:16:41 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:16:41 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:16:41 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:41.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:16:41 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:41 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64004370 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:42 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c001670 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:42 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e60003e70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:43 compute-1 ceph-mon[79770]: pgmap v1069: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 06 10:16:43 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:16:43 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:16:43 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:43.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:16:43 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:16:43 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:16:43 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:43.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:16:43 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:43 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e840030a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:43 compute-1 sshd-session[237841]: banner exchange: Connection from 3.131.215.38 port 47542: invalid format
Dec 06 10:16:43 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:16:44 compute-1 ceph-mon[79770]: pgmap v1070: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 06 10:16:44 compute-1 nova_compute[228576]: 2025-12-06 10:16:44.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:16:44 compute-1 nova_compute[228576]: 2025-12-06 10:16:44.471 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:16:44 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:44 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64004370 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:44 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:44 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c001670 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:45 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:16:45 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:16:45 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:45.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:16:45 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:16:45 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:16:45 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:45.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:16:45 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:45 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e60003e70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:46 compute-1 ceph-mon[79770]: from='client.? 192.168.122.10:0/3526346481' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 06 10:16:46 compute-1 ceph-mon[79770]: from='client.? 192.168.122.10:0/3526346481' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 06 10:16:46 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:46 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e840030a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:46 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:46 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e840030a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:47 compute-1 ceph-mon[79770]: pgmap v1071: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 85 B/s rd, 0 op/s
Dec 06 10:16:47 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:16:47 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:16:47 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:47.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:16:47 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:16:47 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:16:47 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:47.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:16:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:47 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e60003e70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:48 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/3098041809' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:16:48 compute-1 nova_compute[228576]: 2025-12-06 10:16:48.464 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:16:48 compute-1 nova_compute[228576]: 2025-12-06 10:16:48.505 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:16:48 compute-1 nova_compute[228576]: 2025-12-06 10:16:48.505 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:16:48 compute-1 nova_compute[228576]: 2025-12-06 10:16:48.530 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:16:48 compute-1 nova_compute[228576]: 2025-12-06 10:16:48.530 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:16:48 compute-1 nova_compute[228576]: 2025-12-06 10:16:48.530 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:16:48 compute-1 nova_compute[228576]: 2025-12-06 10:16:48.530 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:16:48 compute-1 nova_compute[228576]: 2025-12-06 10:16:48.531 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:16:48 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:48 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c001670 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:48 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:48 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64004370 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:48 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:16:48 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:16:48 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2009192655' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:16:49 compute-1 nova_compute[228576]: 2025-12-06 10:16:49.011 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.481s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:16:49 compute-1 ceph-mon[79770]: pgmap v1072: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 06 10:16:49 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/3494624022' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:16:49 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/2009192655' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:16:49 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:16:49 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:16:49 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:49.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:16:49 compute-1 nova_compute[228576]: 2025-12-06 10:16:49.156 228580 WARNING nova.virt.libvirt.driver [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:16:49 compute-1 nova_compute[228576]: 2025-12-06 10:16:49.157 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5175MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:16:49 compute-1 nova_compute[228576]: 2025-12-06 10:16:49.158 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:16:49 compute-1 nova_compute[228576]: 2025-12-06 10:16:49.158 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:16:49 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:16:49 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:16:49 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:49.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:16:49 compute-1 nova_compute[228576]: 2025-12-06 10:16:49.227 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:16:49 compute-1 nova_compute[228576]: 2025-12-06 10:16:49.228 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:16:49 compute-1 nova_compute[228576]: 2025-12-06 10:16:49.248 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:16:49 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:16:49 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4057191728' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:16:49 compute-1 nova_compute[228576]: 2025-12-06 10:16:49.699 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:16:49 compute-1 nova_compute[228576]: 2025-12-06 10:16:49.706 228580 DEBUG nova.compute.provider_tree [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed in ProviderTree for provider: ff2f17cb-ff1d-4da7-9560-4be741380cb1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:16:49 compute-1 nova_compute[228576]: 2025-12-06 10:16:49.720 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed for provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:16:49 compute-1 nova_compute[228576]: 2025-12-06 10:16:49.721 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:16:49 compute-1 nova_compute[228576]: 2025-12-06 10:16:49.722 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.564s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:16:49 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:49 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64004370 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:50 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/4057191728' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:16:50 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:50 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e78000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:50 compute-1 nova_compute[228576]: 2025-12-06 10:16:50.688 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:16:50 compute-1 nova_compute[228576]: 2025-12-06 10:16:50.689 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:16:50 compute-1 nova_compute[228576]: 2025-12-06 10:16:50.689 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:16:50 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:50 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e58003650 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:51 compute-1 ceph-mon[79770]: pgmap v1073: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 85 B/s rd, 0 op/s
Dec 06 10:16:51 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:16:51 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:16:51 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:51.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:16:51 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:16:51 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:16:51 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:51.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:16:51 compute-1 nova_compute[228576]: 2025-12-06 10:16:51.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:16:51 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:51 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c0048d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:52 compute-1 ceph-mon[79770]: pgmap v1074: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 85 B/s rd, 0 op/s
Dec 06 10:16:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:52 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64004370 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:52 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e78001a00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:53 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:16:53 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:16:53 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:53.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:16:53 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:16:53 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:16:53 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:53.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:16:53 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:53 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e78001a00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:53 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:16:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:16:54.291 141446 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:16:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:16:54.291 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:16:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:16:54.291 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:16:54 compute-1 nova_compute[228576]: 2025-12-06 10:16:54.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:16:54 compute-1 nova_compute[228576]: 2025-12-06 10:16:54.472 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:16:54 compute-1 nova_compute[228576]: 2025-12-06 10:16:54.472 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:16:54 compute-1 nova_compute[228576]: 2025-12-06 10:16:54.500 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 06 10:16:54 compute-1 sudo[237894]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:16:54 compute-1 sudo[237894]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:16:54 compute-1 sudo[237894]: pam_unix(sudo:session): session closed for user root
Dec 06 10:16:54 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:54 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e60003e70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:54 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:54 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c0048d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:54 compute-1 ceph-mon[79770]: pgmap v1075: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 85 B/s rd, 0 op/s
Dec 06 10:16:54 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:16:55 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:16:55 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:16:55 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:55.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:16:55 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:16:55 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:16:55 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:55.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:16:55 compute-1 podman[237919]: 2025-12-06 10:16:55.802255348 +0000 UTC m=+0.107267694 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, org.label-schema.vendor=CentOS)
Dec 06 10:16:55 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:55 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64004370 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:55 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/1188976781' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:16:56 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:56 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e78001a00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:56 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:56 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e60003e70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:57 compute-1 ceph-mon[79770]: pgmap v1076: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 85 B/s rd, 0 op/s
Dec 06 10:16:57 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/3022049998' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:16:57 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:16:57 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:16:57 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:57.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:16:57 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:16:57 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:16:57 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:57.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:16:57 compute-1 nova_compute[228576]: 2025-12-06 10:16:57.492 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:16:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:57 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e6c0048d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Dec 06 10:16:58 compute-1 kernel: ganesha.nfsd[237309]: segfault at 50 ip 00007f9f37c5f32e sp 00007f9ef57f9210 error 4 in libntirpc.so.5.8[7f9f37c44000+2c000] likely on CPU 1 (core 0, socket 1)
Dec 06 10:16:58 compute-1 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Dec 06 10:16:58 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[236427]: 06/12/2025 10:16:58 : epoch 693401fe : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9e64004370 fd 38 proxy ignored for local
Dec 06 10:16:58 compute-1 systemd[1]: Started Process Core Dump (PID 237947/UID 0).
Dec 06 10:16:58 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:16:59 compute-1 ceph-mon[79770]: pgmap v1077: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Dec 06 10:16:59 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:16:59 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:16:59 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:16:59.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:16:59 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:16:59 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:16:59 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:16:59.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:17:00 compute-1 systemd-coredump[237948]: Process 236431 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 57:
                                                    #0  0x00007f9f37c5f32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    ELF object binary architecture: AMD x86-64
Dec 06 10:17:00 compute-1 systemd[1]: systemd-coredump@10-237947-0.service: Deactivated successfully.
Dec 06 10:17:00 compute-1 systemd[1]: systemd-coredump@10-237947-0.service: Consumed 1.507s CPU time.
Dec 06 10:17:00 compute-1 podman[237955]: 2025-12-06 10:17:00.391368037 +0000 UTC m=+0.033829528 container died e1672b0d6c65fac4dad8abe557390306766af98aa8142ef347d33cd29910d02b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.build-date=20250325)
Dec 06 10:17:00 compute-1 systemd[1]: var-lib-containers-storage-overlay-8d8a693bbb4eb50dbf44c219f4881afeb798ed99392b659e6bc95aa1a478f7c4-merged.mount: Deactivated successfully.
Dec 06 10:17:00 compute-1 podman[237955]: 2025-12-06 10:17:00.445373492 +0000 UTC m=+0.087834983 container remove e1672b0d6c65fac4dad8abe557390306766af98aa8142ef347d33cd29910d02b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default)
Dec 06 10:17:00 compute-1 podman[237954]: 2025-12-06 10:17:00.449227578 +0000 UTC m=+0.082245695 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec 06 10:17:00 compute-1 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Main process exited, code=exited, status=139/n/a
Dec 06 10:17:00 compute-1 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Failed with result 'exit-code'.
Dec 06 10:17:00 compute-1 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Consumed 1.790s CPU time.
Dec 06 10:17:01 compute-1 ceph-mon[79770]: pgmap v1078: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 85 B/s rd, 0 op/s
Dec 06 10:17:01 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:17:01 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:17:01 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:01.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:17:01 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:17:01 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:17:01 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:01.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:17:03 compute-1 ceph-mon[79770]: pgmap v1079: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 85 B/s rd, 0 op/s
Dec 06 10:17:03 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:17:03 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:17:03 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:03.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:17:03 compute-1 podman[238018]: 2025-12-06 10:17:03.157035607 +0000 UTC m=+0.066106076 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec 06 10:17:03 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:17:03 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:17:03 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:03.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:17:03 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:17:04 compute-1 ceph-mon[79770]: pgmap v1080: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 85 B/s rd, 0 op/s
Dec 06 10:17:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [WARNING] 339/101704 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 0 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Dec 06 10:17:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [NOTICE] 339/101704 (4) : haproxy version is 2.3.17-d1c9119
Dec 06 10:17:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [NOTICE] 339/101704 (4) : path to executable is /usr/local/sbin/haproxy
Dec 06 10:17:04 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd[84950]: [ALERT] 339/101704 (4) : backend 'backend' has no server available!
Dec 06 10:17:05 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:17:05 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:17:05 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:05.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:17:05 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:17:05 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:17:05 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:05.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:17:07 compute-1 ceph-mon[79770]: pgmap v1081: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 85 B/s rd, 0 op/s
Dec 06 10:17:07 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:17:07 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:17:07 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:07.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:17:07 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:17:07 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:17:07 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:07.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:17:08 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:17:09 compute-1 ceph-mon[79770]: pgmap v1082: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 85 B/s rd, 0 op/s
Dec 06 10:17:09 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:17:09 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:17:09 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:17:09 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:09.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:17:09 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:17:09 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:17:09 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:09.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:17:10 compute-1 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Scheduled restart job, restart counter is at 11.
Dec 06 10:17:10 compute-1 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.djsnbu for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec 06 10:17:10 compute-1 systemd[1]: ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258@nfs.cephfs.0.0.compute-1.djsnbu.service: Consumed 1.790s CPU time.
Dec 06 10:17:10 compute-1 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.djsnbu for 5ecd3f74-dade-5fc4-92ce-8950ae424258...
Dec 06 10:17:10 compute-1 podman[238093]: 2025-12-06 10:17:10.915850758 +0000 UTC m=+0.041348193 container create 044fb2629765feb8ffd5fd258951cd4533635db83b13cd8de7feeb48e81aeb97 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 06 10:17:10 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eb22caed735b4396606c1376888db90490624613a4aa87d53e4dc197468a9281/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Dec 06 10:17:10 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eb22caed735b4396606c1376888db90490624613a4aa87d53e4dc197468a9281/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Dec 06 10:17:10 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eb22caed735b4396606c1376888db90490624613a4aa87d53e4dc197468a9281/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Dec 06 10:17:10 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eb22caed735b4396606c1376888db90490624613a4aa87d53e4dc197468a9281/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.djsnbu-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Dec 06 10:17:10 compute-1 podman[238093]: 2025-12-06 10:17:10.977720978 +0000 UTC m=+0.103218433 container init 044fb2629765feb8ffd5fd258951cd4533635db83b13cd8de7feeb48e81aeb97 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Dec 06 10:17:10 compute-1 podman[238093]: 2025-12-06 10:17:10.982426274 +0000 UTC m=+0.107923709 container start 044fb2629765feb8ffd5fd258951cd4533635db83b13cd8de7feeb48e81aeb97 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9)
Dec 06 10:17:10 compute-1 bash[238093]: 044fb2629765feb8ffd5fd258951cd4533635db83b13cd8de7feeb48e81aeb97
Dec 06 10:17:10 compute-1 podman[238093]: 2025-12-06 10:17:10.897522025 +0000 UTC m=+0.023019480 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Dec 06 10:17:10 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:10 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Dec 06 10:17:10 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:10 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Dec 06 10:17:10 compute-1 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.djsnbu for 5ecd3f74-dade-5fc4-92ce-8950ae424258.
Dec 06 10:17:11 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:11 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Dec 06 10:17:11 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:11 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Dec 06 10:17:11 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:11 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Dec 06 10:17:11 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:11 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Dec 06 10:17:11 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:11 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Dec 06 10:17:11 compute-1 ceph-mon[79770]: pgmap v1083: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail
Dec 06 10:17:11 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:11 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:17:11 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:17:11 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:17:11 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:11.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:17:11 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:17:11 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:17:11 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:11.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:17:11 compute-1 sudo[238150]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:17:11 compute-1 sudo[238150]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:17:11 compute-1 sudo[238150]: pam_unix(sudo:session): session closed for user root
Dec 06 10:17:11 compute-1 sudo[238175]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 06 10:17:11 compute-1 sudo[238175]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:17:11 compute-1 sudo[238175]: pam_unix(sudo:session): session closed for user root
Dec 06 10:17:12 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 10:17:12 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 06 10:17:12 compute-1 sshd-session[238234]: Accepted publickey for zuul from 192.168.122.10 port 38894 ssh2: ECDSA SHA256:r1j7aLsKAM+XxDNbzEU5vWGpGNCOaIBwc7FZdATPttA
Dec 06 10:17:12 compute-1 systemd-logind[788]: New session 55 of user zuul.
Dec 06 10:17:12 compute-1 systemd[1]: Started Session 55 of User zuul.
Dec 06 10:17:12 compute-1 sshd-session[238234]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 06 10:17:12 compute-1 sudo[238238]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt'
Dec 06 10:17:13 compute-1 sudo[238238]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 10:17:13 compute-1 ceph-mon[79770]: pgmap v1084: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail
Dec 06 10:17:13 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:17:13 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:17:13 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 06 10:17:13 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 06 10:17:13 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 10:17:13 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:17:13 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:17:13 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:13.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:17:13 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:17:13 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:17:13 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:13.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:17:13 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:17:14 compute-1 ceph-mon[79770]: pgmap v1085: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Dec 06 10:17:14 compute-1 sudo[238338]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:17:14 compute-1 sudo[238338]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:17:14 compute-1 sudo[238338]: pam_unix(sudo:session): session closed for user root
Dec 06 10:17:15 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:17:15 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:17:15 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:15.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:17:15 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:17:15 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:17:15 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:15.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:17:16 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "status"} v 0)
Dec 06 10:17:16 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3986376339' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Dec 06 10:17:16 compute-1 ceph-mon[79770]: from='client.25475 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:17:16 compute-1 ceph-mon[79770]: from='client.17010 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:17:16 compute-1 ceph-mon[79770]: from='client.17016 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:17:16 compute-1 ceph-mon[79770]: pgmap v1086: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Dec 06 10:17:16 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/3986376339' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Dec 06 10:17:16 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/3306672627' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Dec 06 10:17:16 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/1630414090' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Dec 06 10:17:17 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:17:17 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:17:17 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:17.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:17:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:17 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:17:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:17 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:17:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:17 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:17:17 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:17:17 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:17:17 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:17.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:17:17 compute-1 sudo[238554]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:17:17 compute-1 sudo[238554]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:17:17 compute-1 sudo[238554]: pam_unix(sudo:session): session closed for user root
Dec 06 10:17:18 compute-1 ceph-mon[79770]: from='client.25484 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:17:18 compute-1 ceph-mon[79770]: from='client.26341 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:17:18 compute-1 ceph-mon[79770]: from='client.17028 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:17:18 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:17:18 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:17:18 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:17:19 compute-1 ceph-mon[79770]: pgmap v1087: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 255 B/s wr, 1 op/s
Dec 06 10:17:19 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:17:19 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:17:19 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:19.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:17:19 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:17:19 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:17:19 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:19.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:17:21 compute-1 ceph-mon[79770]: pgmap v1088: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 255 B/s wr, 1 op/s
Dec 06 10:17:21 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:17:21 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:17:21 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:21.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:17:21 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:17:21 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:17:21 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:21.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:17:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:21 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:17:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:21 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:17:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:21 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:17:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:22 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:17:22 compute-1 ovs-vsctl[238652]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Dec 06 10:17:23 compute-1 ceph-mon[79770]: pgmap v1089: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 255 B/s wr, 1 op/s
Dec 06 10:17:23 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:17:23 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:17:23 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:23.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:17:23 compute-1 virtqemud[228188]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Dec 06 10:17:23 compute-1 virtqemud[228188]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Dec 06 10:17:23 compute-1 virtqemud[228188]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Dec 06 10:17:23 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:17:23 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:17:23 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:23.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:17:23 compute-1 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb asok_command: cache status {prefix=cache status} (starting...)
Dec 06 10:17:23 compute-1 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb Can't run that command on an inactive MDS!
Dec 06 10:17:23 compute-1 lvm[238966]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 06 10:17:23 compute-1 lvm[238966]: VG ceph_vg0 finished
Dec 06 10:17:23 compute-1 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb asok_command: client ls {prefix=client ls} (starting...)
Dec 06 10:17:23 compute-1 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb Can't run that command on an inactive MDS!
Dec 06 10:17:23 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:17:24 compute-1 ceph-mon[79770]: pgmap v1090: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 255 B/s wr, 1 op/s
Dec 06 10:17:24 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:17:24 compute-1 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb asok_command: damage ls {prefix=damage ls} (starting...)
Dec 06 10:17:24 compute-1 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb Can't run that command on an inactive MDS!
Dec 06 10:17:24 compute-1 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb asok_command: dump loads {prefix=dump loads} (starting...)
Dec 06 10:17:24 compute-1 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb Can't run that command on an inactive MDS!
Dec 06 10:17:24 compute-1 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb asok_command: dump tree {prefix=dump tree,root=/} (starting...)
Dec 06 10:17:24 compute-1 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb Can't run that command on an inactive MDS!
Dec 06 10:17:25 compute-1 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb asok_command: dump_blocked_ops {prefix=dump_blocked_ops} (starting...)
Dec 06 10:17:25 compute-1 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb Can't run that command on an inactive MDS!
Dec 06 10:17:25 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:17:25 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:17:25 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:25.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:17:25 compute-1 ceph-mon[79770]: from='client.26362 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:17:25 compute-1 ceph-mon[79770]: from='client.17049 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:17:25 compute-1 ceph-mon[79770]: from='client.25508 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:17:25 compute-1 ceph-mon[79770]: from='client.26380 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:17:25 compute-1 ceph-mon[79770]: from='client.? ' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Dec 06 10:17:25 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/1668178014' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Dec 06 10:17:25 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/2295855214' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Dec 06 10:17:25 compute-1 ceph-mon[79770]: from='client.17073 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:17:25 compute-1 ceph-mon[79770]: from='client.25523 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:17:25 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/439237568' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Dec 06 10:17:25 compute-1 ceph-mon[79770]: from='client.26395 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:17:25 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/644308478' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 10:17:25 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/1092890167' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 10:17:25 compute-1 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb asok_command: dump_historic_ops {prefix=dump_historic_ops} (starting...)
Dec 06 10:17:25 compute-1 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb Can't run that command on an inactive MDS!
Dec 06 10:17:25 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:17:25 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:17:25 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:25.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:17:25 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 06 10:17:25 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1011187621' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 10:17:25 compute-1 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb asok_command: dump_historic_ops_by_duration {prefix=dump_historic_ops_by_duration} (starting...)
Dec 06 10:17:25 compute-1 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb Can't run that command on an inactive MDS!
Dec 06 10:17:25 compute-1 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb asok_command: dump_ops_in_flight {prefix=dump_ops_in_flight} (starting...)
Dec 06 10:17:25 compute-1 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb Can't run that command on an inactive MDS!
Dec 06 10:17:25 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config log"} v 0)
Dec 06 10:17:25 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/901373131' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Dec 06 10:17:25 compute-1 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb asok_command: get subtrees {prefix=get subtrees} (starting...)
Dec 06 10:17:25 compute-1 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb Can't run that command on an inactive MDS!
Dec 06 10:17:26 compute-1 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb asok_command: ops {prefix=ops} (starting...)
Dec 06 10:17:26 compute-1 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb Can't run that command on an inactive MDS!
Dec 06 10:17:26 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config-key dump"} v 0)
Dec 06 10:17:26 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/675051402' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Dec 06 10:17:26 compute-1 ceph-mon[79770]: from='client.17091 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:17:26 compute-1 ceph-mon[79770]: from='client.25532 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:17:26 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/1011187621' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 10:17:26 compute-1 ceph-mon[79770]: from='client.26410 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:17:26 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/3220109428' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Dec 06 10:17:26 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/2707223938' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Dec 06 10:17:26 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/901373131' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Dec 06 10:17:26 compute-1 ceph-mon[79770]: from='client.17106 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:17:26 compute-1 ceph-mon[79770]: from='client.25556 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:17:26 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/3753402528' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Dec 06 10:17:26 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/3340143316' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Dec 06 10:17:26 compute-1 ceph-mon[79770]: pgmap v1091: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 170 B/s wr, 0 op/s
Dec 06 10:17:26 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/1289716866' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Dec 06 10:17:26 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/3469772445' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Dec 06 10:17:26 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm"} v 0)
Dec 06 10:17:26 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2227707575' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Dec 06 10:17:26 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0)
Dec 06 10:17:26 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/430822803' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Dec 06 10:17:26 compute-1 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb asok_command: session ls {prefix=session ls} (starting...)
Dec 06 10:17:26 compute-1 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb Can't run that command on an inactive MDS!
Dec 06 10:17:26 compute-1 podman[239348]: 2025-12-06 10:17:26.806632968 +0000 UTC m=+0.106889985 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 06 10:17:26 compute-1 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb asok_command: status {prefix=status} (starting...)
Dec 06 10:17:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:26 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:17:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:26 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:17:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:26 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:17:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:27 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:17:27 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Dec 06 10:17:27 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3824211357' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Dec 06 10:17:27 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:17:27 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:17:27 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:27.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:17:27 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/675051402' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Dec 06 10:17:27 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/2227707575' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Dec 06 10:17:27 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/741915433' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Dec 06 10:17:27 compute-1 ceph-mon[79770]: from='client.26443 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:17:27 compute-1 ceph-mon[79770]: from='client.17148 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:17:27 compute-1 ceph-mon[79770]: from='client.25586 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:17:27 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/3669643766' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Dec 06 10:17:27 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/430822803' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Dec 06 10:17:27 compute-1 ceph-mon[79770]: from='client.26467 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:17:27 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/4128122182' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Dec 06 10:17:27 compute-1 ceph-mon[79770]: from='client.17172 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:17:27 compute-1 ceph-mon[79770]: from='client.25592 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:17:27 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/3824211357' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Dec 06 10:17:27 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/187974181' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Dec 06 10:17:27 compute-1 ceph-mon[79770]: from='client.? ' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Dec 06 10:17:27 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/2688023797' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Dec 06 10:17:27 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/3176248467' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Dec 06 10:17:27 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:17:27 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:17:27 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:27.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:17:27 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "features"} v 0)
Dec 06 10:17:27 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3343059054' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Dec 06 10:17:27 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Dec 06 10:17:27 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1402062969' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Dec 06 10:17:27 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail"} v 0)
Dec 06 10:17:27 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3965562810' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Dec 06 10:17:27 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0)
Dec 06 10:17:27 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1821647303' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Dec 06 10:17:28 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/957817502' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Dec 06 10:17:28 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/3343059054' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Dec 06 10:17:28 compute-1 ceph-mon[79770]: from='client.? ' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Dec 06 10:17:28 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/1402062969' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Dec 06 10:17:28 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/3246663567' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Dec 06 10:17:28 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/2900583648' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Dec 06 10:17:28 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/3581068770' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Dec 06 10:17:28 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/2175235751' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Dec 06 10:17:28 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/3965562810' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Dec 06 10:17:28 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/1821647303' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Dec 06 10:17:28 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/2898722618' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Dec 06 10:17:28 compute-1 ceph-mon[79770]: pgmap v1092: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 170 B/s wr, 1 op/s
Dec 06 10:17:28 compute-1 ceph-mon[79770]: from='client.26524 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:17:28 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/4240707576' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Dec 06 10:17:28 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr stat"} v 0)
Dec 06 10:17:28 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/912603139' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Dec 06 10:17:28 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} v 0)
Dec 06 10:17:28 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1283122563' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Dec 06 10:17:28 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0)
Dec 06 10:17:28 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/357451755' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Dec 06 10:17:28 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:17:29 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:17:29 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.002000049s ======
Dec 06 10:17:29 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:29.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000049s
Dec 06 10:17:29 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} v 0)
Dec 06 10:17:29 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/447826151' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Dec 06 10:17:29 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:17:29 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:17:29 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:29.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:17:29 compute-1 ceph-mon[79770]: from='client.17232 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:17:29 compute-1 ceph-mon[79770]: from='client.25631 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:17:29 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/3536510514' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Dec 06 10:17:29 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/3919289705' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Dec 06 10:17:29 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/2579086106' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Dec 06 10:17:29 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/912603139' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Dec 06 10:17:29 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/1180736574' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Dec 06 10:17:29 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/1283122563' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Dec 06 10:17:29 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/3936739377' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Dec 06 10:17:29 compute-1 ceph-mon[79770]: from='client.26560 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:17:29 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/357451755' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Dec 06 10:17:29 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/98512507' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Dec 06 10:17:29 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/2248478302' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Dec 06 10:17:29 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/447826151' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Dec 06 10:17:29 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0)
Dec 06 10:17:29 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4036683984' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Dec 06 10:17:30 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Dec 06 10:17:30 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1483867891' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Dec 06 10:17:30 compute-1 ceph-mon[79770]: from='client.17295 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:17:30 compute-1 ceph-mon[79770]: from='client.26587 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:17:30 compute-1 ceph-mon[79770]: from='client.25670 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:17:30 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/2166791389' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Dec 06 10:17:30 compute-1 ceph-mon[79770]: from='client.26605 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:17:30 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/2884993703' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Dec 06 10:17:30 compute-1 ceph-mon[79770]: from='client.25679 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:17:30 compute-1 ceph-mon[79770]: from='client.17319 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:17:30 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/4036683984' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Dec 06 10:17:30 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/2906548466' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Dec 06 10:17:30 compute-1 ceph-mon[79770]: pgmap v1093: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:17:30 compute-1 ceph-mon[79770]: from='client.26626 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:17:30 compute-1 ceph-mon[79770]: from='client.25694 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:17:30 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/1575270102' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Dec 06 10:17:30 compute-1 ceph-mon[79770]: from='client.17337 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:17:30 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/1483867891' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Dec 06 10:17:30 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/3684677759' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Dec 06 10:17:30 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Dec 06 10:17:30 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2197448062' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:44:24.064118+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 145 handle_osd_map epochs [145,146], i have 145, src has [1,146]
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca66000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77512704 unmapped: 1007616 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:44:25.064375+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: mgrc handle_mgr_map Got map version 35
Dec 06 10:17:30 compute-1 ceph-osd[77465]: mgrc handle_mgr_map Active mgr is now [v2:192.168.122.100:6800/3885409716,v1:192.168.122.100:6801/3885409716]
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77553664 unmapped: 966656 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:44:26.064556+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77553664 unmapped: 966656 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca66000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:44:27.064796+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 924228 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77553664 unmapped: 966656 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:44:28.065035+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77553664 unmapped: 966656 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:44:29.065251+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77553664 unmapped: 966656 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:44:30.065540+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77553664 unmapped: 966656 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca66000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:44:31.065711+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77561856 unmapped: 958464 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:44:32.065900+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 924228 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77570048 unmapped: 950272 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:44:33.066111+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77578240 unmapped: 942080 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:44:34.066280+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca66000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77578240 unmapped: 942080 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:44:35.066416+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77578240 unmapped: 942080 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:44:36.066577+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77586432 unmapped: 933888 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:44:37.066734+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca66000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 924228 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77594624 unmapped: 925696 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:44:38.066970+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77594624 unmapped: 925696 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:44:39.067183+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77602816 unmapped: 917504 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:44:40.067399+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77602816 unmapped: 917504 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:44:41.067605+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca66000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77611008 unmapped: 909312 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:44:42.067797+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 924228 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77619200 unmapped: 901120 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:44:43.067998+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 ms_handle_reset con 0x55fb25ef5800 session 0x55fb24eda000
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 ms_handle_reset con 0x55fb25ef5400 session 0x55fb23db6f00
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77619200 unmapped: 901120 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:44:44.068246+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77627392 unmapped: 892928 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:44:45.068416+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77627392 unmapped: 892928 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:44:46.068594+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca66000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77627392 unmapped: 892928 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:44:47.068780+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 924228 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77651968 unmapped: 868352 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:44:48.069031+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca66000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77651968 unmapped: 868352 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:44:49.069260+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77660160 unmapped: 860160 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:44:50.069496+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77660160 unmapped: 860160 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:44:51.069730+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77668352 unmapped: 851968 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca66000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:44:52.069923+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 924228 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77676544 unmapped: 843776 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:44:53.070137+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca66000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77676544 unmapped: 843776 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:44:54.070387+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2417d400
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 37.000507355s of 37.167388916s, submitted: 29
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77733888 unmapped: 786432 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:44:55.070989+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77733888 unmapped: 786432 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:44:56.071648+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77733888 unmapped: 786432 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:44:57.074644+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 922420 data_alloc: 218103808 data_used: 135168
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:44:58.076842+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:44:59.077071+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77742080 unmapped: 778240 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:45:00.077751+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77774848 unmapped: 745472 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:45:01.078078+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 ms_handle_reset con 0x55fb24bd3000 session 0x55fb23db7e00
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 ms_handle_reset con 0x55fb235d5c00 session 0x55fb24e98f00
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77774848 unmapped: 745472 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:45:02.078918+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 922420 data_alloc: 218103808 data_used: 135168
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77791232 unmapped: 729088 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:45:03.079243+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77791232 unmapped: 729088 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:45:04.079485+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77791232 unmapped: 729088 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:45:05.080024+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 720896 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:45:06.080312+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77799424 unmapped: 720896 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:45:07.080819+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 922420 data_alloc: 218103808 data_used: 135168
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77807616 unmapped: 712704 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:45:08.081174+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 13.814132690s of 13.842460632s, submitted: 8
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 704512 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:45:09.081636+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77815808 unmapped: 704512 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:45:10.082072+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77824000 unmapped: 696320 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:45:11.082329+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:45:12.082740+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77824000 unmapped: 696320 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 922120 data_alloc: 218103808 data_used: 135168
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb24bd1c00
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:45:13.082900+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77824000 unmapped: 696320 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:45:14.083464+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77832192 unmapped: 688128 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:45:15.083899+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77840384 unmapped: 679936 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:45:16.084293+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77856768 unmapped: 663552 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:45:17.085367+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77873152 unmapped: 647168 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 922420 data_alloc: 218103808 data_used: 135168
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:45:18.085692+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77873152 unmapped: 647168 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:45:19.086305+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77881344 unmapped: 638976 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.166560173s of 11.237159729s, submitted: 6
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:45:20.086755+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77897728 unmapped: 622592 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:45:21.087239+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77897728 unmapped: 622592 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:45:22.087505+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77905920 unmapped: 614400 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 922420 data_alloc: 218103808 data_used: 135168
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:45:23.087862+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77914112 unmapped: 606208 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:45:24.088122+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77914112 unmapped: 606208 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:45:25.088628+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77922304 unmapped: 598016 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:45:26.088823+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77922304 unmapped: 598016 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:45:27.089252+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77930496 unmapped: 589824 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 922272 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:45:28.089438+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77930496 unmapped: 589824 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:45:29.089856+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77930496 unmapped: 589824 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:45:30.090129+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77938688 unmapped: 581632 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:45:31.090544+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77938688 unmapped: 581632 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:45:32.090945+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 922272 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:45:33.091451+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77946880 unmapped: 573440 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:45:34.091655+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77955072 unmapped: 565248 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:45:35.091817+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77955072 unmapped: 565248 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:45:36.092218+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77955072 unmapped: 565248 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:45:37.092586+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77963264 unmapped: 557056 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 922272 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:45:38.092849+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77971456 unmapped: 548864 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:45:39.093058+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77971456 unmapped: 548864 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:45:40.093406+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77979648 unmapped: 540672 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:45:41.093621+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77979648 unmapped: 540672 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:45:42.093816+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77987840 unmapped: 532480 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 922272 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:45:43.094099+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77987840 unmapped: 532480 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:45:44.094380+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77987840 unmapped: 532480 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:45:45.094532+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77996032 unmapped: 524288 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:45:46.094687+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 77996032 unmapped: 524288 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:45:47.095734+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78004224 unmapped: 516096 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 922272 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:45:48.095903+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78004224 unmapped: 516096 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:45:49.096296+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78012416 unmapped: 507904 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:45:50.096531+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78012416 unmapped: 507904 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:45:51.096803+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78012416 unmapped: 507904 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:45:52.097041+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78020608 unmapped: 499712 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 922272 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:45:53.097198+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78028800 unmapped: 491520 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:45:54.097361+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78028800 unmapped: 491520 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:45:55.097566+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78036992 unmapped: 483328 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:45:56.097754+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78036992 unmapped: 483328 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:45:57.098029+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78045184 unmapped: 475136 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 922272 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:45:58.098190+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78045184 unmapped: 475136 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:45:59.098356+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78053376 unmapped: 466944 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:46:00.098603+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78053376 unmapped: 466944 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:46:01.098755+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78053376 unmapped: 466944 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:46:02.098899+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78061568 unmapped: 458752 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 922272 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:46:03.099040+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78061568 unmapped: 458752 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:46:04.099246+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78061568 unmapped: 458752 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:46:05.099471+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78069760 unmapped: 450560 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:46:06.099678+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78069760 unmapped: 450560 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:46:07.099890+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78077952 unmapped: 442368 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 922272 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb26acba40
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:46:08.100121+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78077952 unmapped: 442368 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:46:09.100407+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78077952 unmapped: 442368 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:46:10.101269+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78077952 unmapped: 442368 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:46:11.101422+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78086144 unmapped: 434176 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:46:12.101607+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78086144 unmapped: 434176 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 922272 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:46:13.101710+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78102528 unmapped: 417792 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:46:14.101868+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78102528 unmapped: 417792 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:46:15.102170+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78102528 unmapped: 417792 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:46:16.102792+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78110720 unmapped: 409600 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:46:17.103235+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78110720 unmapped: 409600 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 922272 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:46:18.103623+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78118912 unmapped: 401408 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:46:19.103993+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78118912 unmapped: 401408 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:46:20.104307+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78127104 unmapped: 393216 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb235d5c00
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 60.789440155s of 61.372726440s, submitted: 5
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:46:21.104619+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78135296 unmapped: 385024 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:46:22.104945+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78135296 unmapped: 385024 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 922404 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:46:23.105255+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78143488 unmapped: 376832 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:46:24.105497+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78143488 unmapped: 376832 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:46:25.105769+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 ms_handle_reset con 0x55fb2417cc00 session 0x55fb26aca1e0
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78143488 unmapped: 376832 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:46:26.105937+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78151680 unmapped: 368640 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:46:27.106205+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78159872 unmapped: 360448 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 922420 data_alloc: 218103808 data_used: 135168
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:46:28.106358+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78159872 unmapped: 360448 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:46:29.106532+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78168064 unmapped: 352256 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:46:30.106726+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78168064 unmapped: 352256 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:46:31.106912+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78176256 unmapped: 344064 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:46:32.107443+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78176256 unmapped: 344064 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.027969360s of 12.041974068s, submitted: 4
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 922288 data_alloc: 218103808 data_used: 135168
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:46:33.109123+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78184448 unmapped: 335872 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:46:34.109318+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78184448 unmapped: 335872 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:46:35.109605+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 78192640 unmapped: 327680 heap: 78520320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:46:36.109757+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb24bd1c00
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 79241216 unmapped: 327680 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:46:37.109983+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 79241216 unmapped: 327680 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 922420 data_alloc: 218103808 data_used: 135168
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:46:38.110247+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 79257600 unmapped: 311296 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:46:39.110629+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 79257600 unmapped: 311296 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:46:40.110847+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 79257600 unmapped: 311296 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:46:41.111078+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 79265792 unmapped: 303104 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:46:42.111259+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 79265792 unmapped: 303104 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 923932 data_alloc: 218103808 data_used: 135168
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:46:43.111494+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 79265792 unmapped: 303104 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:46:44.111688+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 79273984 unmapped: 294912 heap: 79568896 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:46:45.111867+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.543926239s of 12.579975128s, submitted: 11
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80322560 unmapped: 294912 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:46:46.112040+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80330752 unmapped: 286720 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:46:47.112242+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80338944 unmapped: 278528 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 923916 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:46:48.112409+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80330752 unmapped: 286720 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:46:49.112644+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80338944 unmapped: 278528 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:46:50.112894+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80338944 unmapped: 278528 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:46:51.113173+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80338944 unmapped: 278528 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:46:52.113472+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80347136 unmapped: 270336 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:46:53.113672+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 923784 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80347136 unmapped: 270336 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:46:54.113951+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80347136 unmapped: 270336 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:46:55.114132+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80355328 unmapped: 262144 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:46:56.114308+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80355328 unmapped: 262144 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:46:57.114487+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80363520 unmapped: 253952 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 ms_handle_reset con 0x55fb235d5c00 session 0x55fb2719dc20
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:46:58.114694+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 923784 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80363520 unmapped: 253952 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:46:59.114843+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80371712 unmapped: 245760 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:47:00.115079+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80371712 unmapped: 245760 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:47:01.115229+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80371712 unmapped: 245760 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:47:02.115398+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80379904 unmapped: 237568 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:47:03.115538+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 923784 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80379904 unmapped: 237568 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:47:04.115730+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80388096 unmapped: 229376 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:47:05.115896+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80388096 unmapped: 229376 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:47:06.116098+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80388096 unmapped: 229376 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:47:07.116263+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80396288 unmapped: 221184 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:47:08.116580+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 923784 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80396288 unmapped: 221184 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:47:09.116773+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb24bd3000
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 23.654951096s of 23.672401428s, submitted: 5
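
The _kv_sync_thread line quantifies how busy the RocksDB commit path was over its sampling window: idle for 23.65 s of 23.67 s with 5 transactions submitted, i.e. well under 0.1% busy. Both figures fall straight out of the logged numbers:

    # Busy fraction and commit rate for the _kv_sync_thread report above.
    idle, window, submitted = 23.654951096, 23.672401428, 5
    print(f"busy {1 - idle / window:.4%}, {submitted / window:.2f} commits/s")
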
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80404480 unmapped: 212992 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:47:10.116996+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80404480 unmapped: 212992 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:47:11.117237+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80404480 unmapped: 212992 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:47:12.117508+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80420864 unmapped: 196608 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:47:13.117686+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926956 data_alloc: 218103808 data_used: 135168
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80420864 unmapped: 196608 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:47:14.117860+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80420864 unmapped: 196608 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:47:15.118054+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80429056 unmapped: 188416 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:47:16.118325+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80470016 unmapped: 147456 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:47:17.118670+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80478208 unmapped: 139264 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:47:18.119046+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926956 data_alloc: 218103808 data_used: 135168
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80478208 unmapped: 139264 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:47:19.119298+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.962579727s of 10.001787186s, submitted: 11
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80494592 unmapped: 122880 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:47:20.119600+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80502784 unmapped: 114688 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:47:21.119900+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80502784 unmapped: 114688 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:47:22.120083+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80510976 unmapped: 106496 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:47:23.120261+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926349 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80519168 unmapped: 98304 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:47:24.120434+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80519168 unmapped: 98304 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:47:25.120717+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80527360 unmapped: 90112 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:47:26.120911+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80551936 unmapped: 65536 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:47:27.121084+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80560128 unmapped: 57344 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:47:28.121309+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926217 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80560128 unmapped: 57344 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:47:29.121559+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80560128 unmapped: 57344 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:47:30.121856+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80568320 unmapped: 49152 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:47:31.122013+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80568320 unmapped: 49152 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:47:32.122211+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80576512 unmapped: 40960 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:47:33.122389+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926217 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80576512 unmapped: 40960 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:47:34.122558+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80576512 unmapped: 40960 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:47:35.122770+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80584704 unmapped: 32768 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:47:36.122942+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80584704 unmapped: 32768 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:47:37.123075+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80601088 unmapped: 16384 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:47:38.123219+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926217 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80601088 unmapped: 16384 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:47:39.123379+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80601088 unmapped: 16384 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:47:40.123581+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80609280 unmapped: 8192 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:47:41.123720+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80609280 unmapped: 8192 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:47:42.123884+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80617472 unmapped: 0 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:47:43.124050+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926217 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80617472 unmapped: 0 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:47:44.124223+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80617472 unmapped: 0 heap: 80617472 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:47:45.124387+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80625664 unmapped: 1040384 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:47:46.124543+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80625664 unmapped: 1040384 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:47:47.124683+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80633856 unmapped: 1032192 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:47:48.124817+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926217 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80633856 unmapped: 1032192 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:47:49.124963+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80633856 unmapped: 1032192 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:47:50.125235+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80642048 unmapped: 1024000 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:47:51.125392+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80642048 unmapped: 1024000 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:47:52.125650+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80650240 unmapped: 1015808 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:47:53.125812+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926217 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80650240 unmapped: 1015808 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:47:54.125975+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80650240 unmapped: 1015808 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:47:55.126192+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80658432 unmapped: 1007616 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:47:56.126402+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80658432 unmapped: 1007616 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:47:57.126579+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80666624 unmapped: 999424 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:47:58.126736+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926217 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80666624 unmapped: 999424 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:47:59.126894+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80666624 unmapped: 999424 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:48:00.127113+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80674816 unmapped: 991232 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:48:01.127456+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80674816 unmapped: 991232 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:48:02.127701+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80683008 unmapped: 983040 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:48:03.127935+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926217 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80683008 unmapped: 983040 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:48:04.128162+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80683008 unmapped: 983040 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:48:05.128332+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80691200 unmapped: 974848 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:48:06.128534+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80691200 unmapped: 974848 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:48:07.128723+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80699392 unmapped: 966656 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:48:08.128889+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926217 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80699392 unmapped: 966656 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:48:09.129060+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80699392 unmapped: 966656 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:48:10.129341+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80707584 unmapped: 958464 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:48:11.129575+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80707584 unmapped: 958464 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:48:12.129771+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80715776 unmapped: 950272 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:48:13.129930+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926217 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80715776 unmapped: 950272 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:48:14.130104+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80723968 unmapped: 942080 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:48:15.130215+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80723968 unmapped: 942080 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:48:16.130435+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80723968 unmapped: 942080 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:48:17.130597+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80732160 unmapped: 933888 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:48:18.130758+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926217 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80732160 unmapped: 933888 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:48:19.130931+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80740352 unmapped: 925696 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:48:20.131223+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80740352 unmapped: 925696 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:48:21.131410+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80748544 unmapped: 917504 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:48:22.131597+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80748544 unmapped: 917504 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:48:23.131759+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926217 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80748544 unmapped: 917504 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:48:24.131926+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80756736 unmapped: 909312 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:48:25.132090+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80756736 unmapped: 909312 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:48:26.132270+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80764928 unmapped: 901120 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:48:27.132506+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80781312 unmapped: 884736 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:48:28.132666+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926217 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80781312 unmapped: 884736 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:48:29.132811+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80789504 unmapped: 876544 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:48:30.132998+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80789504 unmapped: 876544 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:48:31.133138+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80797696 unmapped: 868352 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:48:32.133280+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80797696 unmapped: 868352 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:48:33.133464+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926217 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80797696 unmapped: 868352 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:48:34.133635+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80805888 unmapped: 860160 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:48:35.133786+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80805888 unmapped: 860160 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:48:36.133908+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80805888 unmapped: 860160 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:48:37.134081+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80814080 unmapped: 851968 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:48:38.134254+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926217 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80814080 unmapped: 851968 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:48:39.134422+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80822272 unmapped: 843776 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:48:40.134622+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80822272 unmapped: 843776 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:48:41.134757+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:48:42.134982+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80822272 unmapped: 843776 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:48:43.135213+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80830464 unmapped: 835584 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926217 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb2719d2c0
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:48:44.135660+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80830464 unmapped: 835584 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:48:45.135986+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80830464 unmapped: 835584 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:48:46.136217+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80838656 unmapped: 827392 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:48:47.136379+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80846848 unmapped: 819200 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:48:48.136724+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80846848 unmapped: 819200 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926217 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:48:49.137299+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80846848 unmapped: 819200 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:48:50.137552+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80855040 unmapped: 811008 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:48:51.137761+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80855040 unmapped: 811008 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:48:52.138043+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80855040 unmapped: 811008 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:48:53.138209+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80863232 unmapped: 802816 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926217 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:48:54.138401+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80863232 unmapped: 802816 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb25ef5400
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 95.385993958s of 95.397544861s, submitted: 3
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:48:55.138595+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80871424 unmapped: 794624 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:48:56.138749+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80871424 unmapped: 794624 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:48:57.138879+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80871424 unmapped: 794624 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:48:58.139192+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80879616 unmapped: 786432 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926365 data_alloc: 218103808 data_used: 135168
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:48:59.139518+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80879616 unmapped: 786432 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:49:00.139780+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80879616 unmapped: 786432 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:49:01.139972+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80887808 unmapped: 778240 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:49:02.140195+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80887808 unmapped: 778240 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:49:03.140395+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 80904192 unmapped: 761856 heap: 81666048 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 925774 data_alloc: 218103808 data_used: 135168
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:49:04.140734+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 81952768 unmapped: 761856 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:49:05.140888+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 81952768 unmapped: 761856 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Cumulative writes: 7179 writes, 30K keys, 7179 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.03 MB/s
                                           Cumulative WAL: 7179 writes, 1333 syncs, 5.39 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 7179 writes, 30K keys, 7179 commit groups, 1.0 writes per commit group, ingest: 20.58 MB, 0.03 MB/s
                                           Interval WAL: 7179 writes, 1333 syncs, 5.39 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.013       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.013       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.01              0.00         1    0.013       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fb227cf350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 0.000158 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fb227cf350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 0.000158 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fb227cf350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 0.000158 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fb227cf350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 0.000158 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fb227cf350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 0.000158 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fb227cf350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 0.000158 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fb227cf350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 0.000158 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fb227ce9b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fb227ce9b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fb227ce9b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fb227cf350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 0.000158 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fb227cf350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 0.000158 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:49:06.141047+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82034688 unmapped: 679936 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.904463768s of 12.073836327s, submitted: 11
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:49:07.141198+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82034688 unmapped: 679936 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:49:08.141377+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82034688 unmapped: 679936 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 925167 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:49:09.141566+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82042880 unmapped: 671744 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:49:10.141781+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82042880 unmapped: 671744 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:49:11.142072+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82051072 unmapped: 663552 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:49:12.142311+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82067456 unmapped: 647168 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:49:13.142480+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82067456 unmapped: 647168 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 925035 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:49:14.142664+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 638976 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:49:15.142849+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 638976 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:49:16.143036+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82083840 unmapped: 630784 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:49:17.143176+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82083840 unmapped: 630784 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:49:18.143346+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82092032 unmapped: 622592 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 925035 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:49:19.143509+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82100224 unmapped: 614400 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:49:20.143700+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82100224 unmapped: 614400 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:49:21.143867+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82108416 unmapped: 606208 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:49:22.144055+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82108416 unmapped: 606208 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:49:23.144272+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82108416 unmapped: 606208 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 925035 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:49:24.144439+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82116608 unmapped: 598016 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:49:25.144601+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82116608 unmapped: 598016 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:49:26.144762+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82124800 unmapped: 589824 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:49:27.144920+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82124800 unmapped: 589824 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:49:28.145076+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82124800 unmapped: 589824 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 925035 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:49:29.145327+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82132992 unmapped: 581632 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:49:30.145539+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82132992 unmapped: 581632 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:49:31.145695+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82141184 unmapped: 573440 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:49:32.145904+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82141184 unmapped: 573440 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:49:33.146029+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82149376 unmapped: 565248 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 925035 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:49:34.146204+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82157568 unmapped: 557056 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:49:35.146361+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82157568 unmapped: 557056 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:49:36.146504+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82165760 unmapped: 548864 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:49:37.146667+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82165760 unmapped: 548864 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:49:38.146818+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82165760 unmapped: 548864 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 925035 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:49:39.146990+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82173952 unmapped: 540672 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:49:40.147217+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82173952 unmapped: 540672 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:49:41.147379+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82182144 unmapped: 532480 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:49:42.147559+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82182144 unmapped: 532480 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:49:43.147712+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 ms_handle_reset con 0x55fb25ef5400 session 0x55fb240be000
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82190336 unmapped: 524288 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 925035 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:49:44.147863+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82190336 unmapped: 524288 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:49:45.148031+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82190336 unmapped: 524288 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:49:46.148235+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82190336 unmapped: 524288 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:49:47.148403+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82198528 unmapped: 516096 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:49:48.148667+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82198528 unmapped: 516096 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 925035 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:49:49.148836+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82206720 unmapped: 507904 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:49:50.149049+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82206720 unmapped: 507904 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:49:51.149236+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82214912 unmapped: 499712 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:49:52.149438+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82214912 unmapped: 499712 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:49:53.149591+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82223104 unmapped: 491520 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 925035 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb25ef5800
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 47.327915192s of 47.335025787s, submitted: 2
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:49:54.149759+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82206720 unmapped: 507904 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:49:55.149951+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82206720 unmapped: 507904 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:49:56.150198+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82206720 unmapped: 507904 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:49:57.150453+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82223104 unmapped: 491520 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:49:58.150658+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82239488 unmapped: 475136 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 925183 data_alloc: 218103808 data_used: 135168
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:49:59.150813+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82239488 unmapped: 475136 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:50:00.151241+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82247680 unmapped: 466944 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:50:01.151444+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82255872 unmapped: 458752 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:50:02.151610+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82255872 unmapped: 458752 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:50:03.151825+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82255872 unmapped: 458752 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 925183 data_alloc: 218103808 data_used: 135168
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:50:04.152049+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82264064 unmapped: 450560 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:50:05.152213+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82264064 unmapped: 450560 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:50:06.152447+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82272256 unmapped: 442368 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:50:07.152651+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82272256 unmapped: 442368 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:50:08.152832+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82280448 unmapped: 434176 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 925183 data_alloc: 218103808 data_used: 135168
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:50:09.153129+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 15.530076981s of 15.585687637s, submitted: 10
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82288640 unmapped: 425984 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:50:10.153497+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82288640 unmapped: 425984 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:50:11.153774+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82296832 unmapped: 417792 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:50:12.153982+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82296832 unmapped: 417792 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:50:13.154239+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82313216 unmapped: 401408 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 925035 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:50:14.154474+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82313216 unmapped: 401408 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:50:15.154742+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82321408 unmapped: 393216 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:50:16.155036+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82321408 unmapped: 393216 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:50:17.155268+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82321408 unmapped: 393216 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:50:18.155446+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82329600 unmapped: 385024 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 925035 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:50:19.155760+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82329600 unmapped: 385024 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:50:20.156053+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82329600 unmapped: 385024 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:50:21.156250+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82337792 unmapped: 376832 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:50:22.156487+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82337792 unmapped: 376832 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:50:23.156684+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82345984 unmapped: 368640 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 925035 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:50:24.156829+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82345984 unmapped: 368640 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:50:25.157014+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82354176 unmapped: 360448 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:50:26.157224+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82354176 unmapped: 360448 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:50:27.157397+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82354176 unmapped: 360448 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:50:28.157541+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82362368 unmapped: 352256 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 925035 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:50:29.157722+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82362368 unmapped: 352256 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:50:30.157948+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82370560 unmapped: 344064 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:50:31.158135+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82370560 unmapped: 344064 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:50:32.158339+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82378752 unmapped: 335872 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:50:33.158570+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82378752 unmapped: 335872 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 925035 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:50:34.158719+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82378752 unmapped: 335872 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:50:35.158914+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82386944 unmapped: 327680 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:50:36.159075+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82386944 unmapped: 327680 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:50:37.159348+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82386944 unmapped: 327680 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:50:38.159546+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82395136 unmapped: 319488 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 925035 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:50:39.159694+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82395136 unmapped: 319488 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:50:40.159935+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82395136 unmapped: 319488 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:50:41.160223+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82403328 unmapped: 311296 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:50:42.160385+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82403328 unmapped: 311296 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:50:43.160614+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82411520 unmapped: 303104 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:50:44.160763+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 925035 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82411520 unmapped: 303104 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:50:45.160946+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82411520 unmapped: 303104 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:50:46.161207+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82419712 unmapped: 294912 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:50:47.161414+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82419712 unmapped: 294912 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:50:48.161587+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82427904 unmapped: 286720 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:50:49.161750+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 925035 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82427904 unmapped: 286720 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:50:50.161981+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82436096 unmapped: 278528 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:50:51.162194+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82436096 unmapped: 278528 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:50:52.162441+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82436096 unmapped: 278528 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:50:53.162598+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82444288 unmapped: 270336 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:50:54.163220+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 925035 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82444288 unmapped: 270336 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:50:55.163487+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82444288 unmapped: 270336 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:50:56.163688+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82452480 unmapped: 262144 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:50:57.164250+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82460672 unmapped: 253952 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:50:58.164469+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 245760 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:50:59.164799+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 925035 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 245760 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:51:00.164995+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 237568 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:51:01.165284+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 237568 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:51:02.165430+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 237568 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:51:03.165645+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 229376 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:51:04.165827+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 925035 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 229376 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:51:05.165977+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82493440 unmapped: 221184 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:51:06.166197+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82493440 unmapped: 221184 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:51:07.166396+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82493440 unmapped: 221184 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:51:08.166556+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82501632 unmapped: 212992 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:51:09.166703+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 925035 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82501632 unmapped: 212992 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:51:10.166994+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82501632 unmapped: 212992 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:51:11.167180+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82501632 unmapped: 212992 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:51:12.167304+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 204800 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:51:13.167586+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 204800 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:51:14.167753+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 925035 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 204800 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:51:15.167927+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 204800 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:51:16.168127+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 204800 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:51:17.168314+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 204800 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:51:18.168548+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 204800 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:51:19.168711+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 925035 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 204800 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:51:20.168953+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 204800 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:51:21.169109+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 204800 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:51:22.169240+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 204800 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:51:23.169442+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 204800 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:51:24.169606+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 925035 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 204800 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:51:25.169830+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 204800 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:51:26.170126+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 204800 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:51:27.170343+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 204800 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:51:28.170520+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 204800 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:51:29.170698+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 925035 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 204800 heap: 82714624 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:51:30.170971+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 80.756385803s of 80.760154724s, submitted: 1
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82599936 unmapped: 1163264 heap: 83763200 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:51:31.171222+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82616320 unmapped: 1146880 heap: 83763200 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:51:32.171378+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82616320 unmapped: 1146880 heap: 83763200 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:51:33.171562+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 ms_handle_reset con 0x55fb24bd3000 session 0x55fb2719d860
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82919424 unmapped: 843776 heap: 83763200 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:51:34.171770+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 925035 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82919424 unmapped: 843776 heap: 83763200 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:51:35.171962+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82927616 unmapped: 835584 heap: 83763200 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:51:36.172277+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82927616 unmapped: 835584 heap: 83763200 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:51:37.172457+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82927616 unmapped: 835584 heap: 83763200 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:51:38.173112+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82927616 unmapped: 835584 heap: 83763200 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:51:39.173345+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 925035 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82927616 unmapped: 835584 heap: 83763200 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:51:40.173548+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82927616 unmapped: 835584 heap: 83763200 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:51:41.173707+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82935808 unmapped: 827392 heap: 83763200 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:51:42.174112+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82935808 unmapped: 827392 heap: 83763200 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:51:43.174403+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82935808 unmapped: 827392 heap: 83763200 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:51:44.174657+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 925035 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb235d5c00
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 13.854737282s of 14.505135536s, submitted: 221
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82927616 unmapped: 835584 heap: 83763200 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:51:45.174837+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82927616 unmapped: 835584 heap: 83763200 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:51:46.175047+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82927616 unmapped: 835584 heap: 83763200 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:51:47.175309+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82927616 unmapped: 835584 heap: 83763200 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:51:48.175517+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82944000 unmapped: 819200 heap: 83763200 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:51:49.175733+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 925183 data_alloc: 218103808 data_used: 135168
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82944000 unmapped: 819200 heap: 83763200 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:51:50.176058+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82952192 unmapped: 811008 heap: 83763200 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:51:51.176313+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84033536 unmapped: 778240 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:51:52.176450+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84033536 unmapped: 778240 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:51:53.176625+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82984960 unmapped: 1826816 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:51:54.176782+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926695 data_alloc: 218103808 data_used: 135168
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82984960 unmapped: 1826816 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:51:55.176927+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82984960 unmapped: 1826816 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:51:56.177082+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82993152 unmapped: 1818624 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:51:57.177223+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83001344 unmapped: 1810432 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:51:58.177364+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 14.295805931s of 14.345085144s, submitted: 11
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:51:59.177516+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82993152 unmapped: 1818624 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926395 data_alloc: 218103808 data_used: 135168
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:52:00.177848+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82993152 unmapped: 1818624 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:52:01.177986+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82993152 unmapped: 1818624 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:52:02.178169+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 82993152 unmapped: 1818624 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:52:03.178299+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83001344 unmapped: 1810432 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:52:04.178433+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83001344 unmapped: 1810432 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926547 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:52:05.178571+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83001344 unmapped: 1810432 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:52:06.178859+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83001344 unmapped: 1810432 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:52:07.179077+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83001344 unmapped: 1810432 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:52:08.179369+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83001344 unmapped: 1810432 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:52:09.179558+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83001344 unmapped: 1810432 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926547 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:52:10.179840+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83001344 unmapped: 1810432 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:52:11.180017+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83001344 unmapped: 1810432 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:52:12.180263+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83009536 unmapped: 1802240 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:52:13.180545+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83009536 unmapped: 1802240 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:52:14.180706+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83009536 unmapped: 1802240 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926547 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:52:15.180911+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83017728 unmapped: 1794048 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:52:16.260534+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83017728 unmapped: 1794048 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:52:17.260674+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83025920 unmapped: 1785856 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:52:18.260883+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83025920 unmapped: 1785856 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:52:19.261058+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83025920 unmapped: 1785856 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926547 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:52:20.261245+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83025920 unmapped: 1785856 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:52:21.261473+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83025920 unmapped: 1785856 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:52:22.261693+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83025920 unmapped: 1785856 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:52:23.261834+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83025920 unmapped: 1785856 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:52:24.261997+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83025920 unmapped: 1785856 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926547 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:52:25.262248+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83017728 unmapped: 1794048 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:52:26.262522+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83017728 unmapped: 1794048 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:52:27.262699+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83017728 unmapped: 1794048 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:52:28.262942+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83017728 unmapped: 1794048 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:52:29.263122+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83017728 unmapped: 1794048 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926547 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:52:30.263425+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83017728 unmapped: 1794048 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:52:31.263641+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83017728 unmapped: 1794048 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:52:32.263879+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83017728 unmapped: 1794048 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:52:33.264115+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83017728 unmapped: 1794048 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:52:34.264339+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83017728 unmapped: 1794048 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926547 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:52:35.264549+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83017728 unmapped: 1794048 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:52:36.264711+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83017728 unmapped: 1794048 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:52:37.264920+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83017728 unmapped: 1794048 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:52:38.265195+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83017728 unmapped: 1794048 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:52:39.265449+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83017728 unmapped: 1794048 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926547 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:52:40.265656+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83017728 unmapped: 1794048 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:52:41.265902+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83017728 unmapped: 1794048 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:52:42.266101+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83017728 unmapped: 1794048 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:52:43.266342+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83025920 unmapped: 1785856 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:52:44.266545+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83025920 unmapped: 1785856 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926547 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:52:45.266807+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83025920 unmapped: 1785856 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:52:46.267026+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83025920 unmapped: 1785856 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:52:47.267230+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83025920 unmapped: 1785856 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:52:48.267403+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83025920 unmapped: 1785856 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:52:49.267620+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83025920 unmapped: 1785856 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926547 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:52:50.267887+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83025920 unmapped: 1785856 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:52:51.268180+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83025920 unmapped: 1785856 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:52:52.268390+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83025920 unmapped: 1785856 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 ms_handle_reset con 0x55fb235d5c00 session 0x55fb26ae6780
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:52:53.268556+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83025920 unmapped: 1785856 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:52:54.268755+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83025920 unmapped: 1785856 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926547 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:52:55.268974+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83025920 unmapped: 1785856 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:52:56.269129+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83025920 unmapped: 1785856 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:52:57.269356+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83025920 unmapped: 1785856 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:52:58.269526+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83025920 unmapped: 1785856 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:52:59.269827+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83025920 unmapped: 1785856 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926547 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:53:00.270164+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83025920 unmapped: 1785856 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:53:01.270378+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83025920 unmapped: 1785856 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:53:02.270588+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83025920 unmapped: 1785856 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:53:03.270799+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2417cc00
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 64.185989380s of 64.189659119s, submitted: 1
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83034112 unmapped: 1777664 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:53:04.271232+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83034112 unmapped: 1777664 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 926679 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:53:05.271387+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83034112 unmapped: 1777664 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:53:06.271549+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83034112 unmapped: 1777664 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:53:07.271712+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83034112 unmapped: 1777664 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:53:08.271864+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83034112 unmapped: 1777664 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:53:09.272019+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83050496 unmapped: 1761280 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 928207 data_alloc: 218103808 data_used: 135168
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:53:10.272215+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83050496 unmapped: 1761280 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:53:11.272370+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83050496 unmapped: 1761280 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:53:12.272506+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83066880 unmapped: 1744896 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:53:13.272650+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83066880 unmapped: 1744896 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:53:14.272833+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83066880 unmapped: 1744896 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.424573898s of 11.459362030s, submitted: 10
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 927907 data_alloc: 218103808 data_used: 135168
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:53:15.273004+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:53:16.273166+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:53:17.273307+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:53:18.273484+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:53:19.273689+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 927468 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:53:20.273920+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:53:21.274120+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:53:22.274310+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:53:23.274523+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:53:24.274742+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 927468 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:53:25.274942+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:53:26.275197+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:53:27.275380+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:53:28.275608+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:53:29.275875+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 927468 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:53:30.276098+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:53:31.276299+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:53:32.276448+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:53:33.276629+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:53:34.276818+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 927468 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:53:35.278213+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:53:36.279132+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:53:37.279986+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:53:38.280599+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:53:39.281290+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 927468 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:53:40.281514+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:53:41.281725+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:53:42.282269+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:53:43.282795+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:53:44.283225+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 927468 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:53:45.283384+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:53:46.284263+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:53:47.284457+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:53:48.284648+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:53:49.284896+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 ms_handle_reset con 0x55fb25ef5800 session 0x55fb25e2b0e0
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 927468 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:53:50.285091+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:53:51.285388+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:53:52.285544+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:53:53.285761+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:53:54.285945+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 927468 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:53:55.286134+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:53:56.286412+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:53:57.286588+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:53:58.286765+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:53:59.286952+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 927468 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:00.287271+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83075072 unmapped: 1736704 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb24bd1c00
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 46.125885010s of 46.133140564s, submitted: 2
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:01.287463+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83083264 unmapped: 1728512 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:02.287666+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83083264 unmapped: 1728512 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:03.288487+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83083264 unmapped: 1728512 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:04.288680+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929128 data_alloc: 218103808 data_used: 135168
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:05.288847+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:06.289019+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:07.289258+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:08.289445+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:09.289625+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 928369 data_alloc: 218103808 data_used: 135168
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:10.289838+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:11.290007+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:12.290244+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:13.290426+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.988737106s of 13.028569221s, submitted: 12
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:14.290735+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 928389 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:15.290956+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:16.291260+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:17.291508+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:18.291662+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:19.291893+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 928389 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:20.292134+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:21.292395+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:22.292588+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:23.292801+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:24.292967+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 928389 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:25.293129+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:26.293290+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:27.293531+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:28.293701+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:29.293820+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:30.294001+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 928389 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 ms_handle_reset con 0x55fb2417d400 session 0x55fb252a5680
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:31.294354+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:32.294555+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:33.294835+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:34.295044+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:35.295277+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 928389 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:36.295468+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:37.295692+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:38.295878+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:39.296066+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:40.296255+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 928389 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:41.296411+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb25ef5400
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 27.333505630s of 27.336708069s, submitted: 1
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:42.296539+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83124224 unmapped: 1687552 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:43.296675+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83124224 unmapped: 1687552 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:44.296816+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [0,2])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83148800 unmapped: 1662976 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:45.296952+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930049 data_alloc: 218103808 data_used: 135168
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83148800 unmapped: 1662976 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:46.297113+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83148800 unmapped: 1662976 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:47.297273+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83165184 unmapped: 1646592 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:48.297438+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83165184 unmapped: 1646592 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:49.297627+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83165184 unmapped: 1646592 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:50.297860+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930049 data_alloc: 218103808 data_used: 135168
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83165184 unmapped: 1646592 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:51.298038+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83165184 unmapped: 1646592 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:52.298245+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83165184 unmapped: 1646592 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:53.298366+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83165184 unmapped: 1646592 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:54.298551+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.960668564s of 13.129460335s, submitted: 12
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83173376 unmapped: 1638400 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:55.298722+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929310 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83173376 unmapped: 1638400 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:56.298896+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83173376 unmapped: 1638400 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:57.299065+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83173376 unmapped: 1638400 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:58.299273+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83173376 unmapped: 1638400 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:59.299493+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83173376 unmapped: 1638400 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:00.299719+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929310 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83173376 unmapped: 1638400 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:01.299882+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83173376 unmapped: 1638400 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:02.300055+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:03.300279+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:04.300433+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:05.300620+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929310 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:06.300766+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:07.300924+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:08.301111+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:09.301236+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:10.301453+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929310 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:11.301581+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:12.301728+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:13.301966+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:14.302117+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:15.302311+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929310 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:16.302468+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:17.302603+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:18.302768+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:19.302983+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:20.303218+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929310 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:21.303367+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:22.303519+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:23.303656+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:24.303799+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:25.303958+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929310 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:26.304087+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:27.304239+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:28.304383+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:29.304535+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:30.304744+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929310 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:31.304916+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 ms_handle_reset con 0x55fb2417cc00 session 0x55fb271bda40
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:32.305085+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:33.305269+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:34.305446+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:35.305620+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929310 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:36.305846+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:37.306034+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:38.306205+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:39.306395+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:40.306846+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929310 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:41.307062+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:42.307259+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb24bd2400
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 48.355422974s of 48.445949554s, submitted: 1
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:43.307647+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83222528 unmapped: 1589248 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:44.307846+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83222528 unmapped: 1589248 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:45.308028+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83222528 unmapped: 1589248 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929442 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 ms_handle_reset con 0x55fb25ef5400 session 0x55fb26d65860
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:46.308263+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83222528 unmapped: 1589248 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:47.308432+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83222528 unmapped: 1589248 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:48.308615+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83222528 unmapped: 1589248 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:49.308905+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83222528 unmapped: 1589248 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:50.309097+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83222528 unmapped: 1589248 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930970 data_alloc: 218103808 data_used: 135168
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:51.309315+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83222528 unmapped: 1589248 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:52.315094+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83222528 unmapped: 1589248 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:53.315269+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83222528 unmapped: 1589248 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:54.315431+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83222528 unmapped: 1589248 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:55.315678+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83222528 unmapped: 1589248 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930970 data_alloc: 218103808 data_used: 135168
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:56.315848+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83222528 unmapped: 1589248 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb235d5c00
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 14.050199509s of 14.091516495s, submitted: 5
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:57.316080+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83222528 unmapped: 1589248 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:58.316881+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83279872 unmapped: 1531904 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:59.317071+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83279872 unmapped: 1531904 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:00.317446+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83296256 unmapped: 1515520 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930379 data_alloc: 218103808 data_used: 135168
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:01.317744+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83296256 unmapped: 1515520 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:02.318037+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83296256 unmapped: 1515520 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:03.318283+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83304448 unmapped: 1507328 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:04.318548+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83304448 unmapped: 1507328 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:05.318733+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83304448 unmapped: 1507328 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931891 data_alloc: 218103808 data_used: 135168
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:06.318937+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83304448 unmapped: 1507328 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:07.319164+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83304448 unmapped: 1507328 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:08.319410+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83304448 unmapped: 1507328 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.031768799s of 12.082296371s, submitted: 15
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:09.319553+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83329024 unmapped: 1482752 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:10.322383+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83329024 unmapped: 1482752 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931284 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:11.322520+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83345408 unmapped: 1466368 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:12.322698+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83345408 unmapped: 1466368 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:13.322884+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83345408 unmapped: 1466368 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:14.323088+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83345408 unmapped: 1466368 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:15.323330+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83345408 unmapped: 1466368 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931152 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:16.323502+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83345408 unmapped: 1466368 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:17.323683+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83345408 unmapped: 1466368 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:18.323864+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 1449984 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:19.323992+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 1449984 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:20.324182+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 1449984 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931152 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:21.324322+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 1449984 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:22.324469+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 1449984 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:23.324698+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 1449984 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:24.324899+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 1449984 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:25.325045+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 1449984 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931152 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:26.325716+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 1449984 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:27.325878+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 1449984 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:28.326018+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 1449984 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:29.326193+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 1449984 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:30.326573+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 1449984 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931152 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:31.326719+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 1449984 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:32.326864+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 1449984 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:33.327078+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 1449984 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:34.327214+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 1449984 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:35.327392+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 1449984 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931152 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:36.327550+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 1449984 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:37.327708+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 1449984 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:38.327846+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:39.327989+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:40.328196+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931152 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:41.328350+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:42.328515+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:43.328669+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:44.328913+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:45.329532+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931152 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:46.329703+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:47.329854+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:48.330262+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:49.330418+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:50.331423+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931152 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:51.332286+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:52.332451+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:53.332626+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:54.332800+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:55.332978+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931152 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:56.333179+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:57.333640+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:58.334062+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:59.334214+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:00.334416+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931152 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:01.334781+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:02.334932+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:03.335247+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:04.335428+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 ms_handle_reset con 0x55fb235d5c00 session 0x55fb26f585a0
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:05.335732+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931152 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:06.335979+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:07.336137+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:08.336302+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:09.336571+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:10.336794+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931152 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:11.337044+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:12.337219+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:13.337389+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:14.337726+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2417cc00
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 66.122673035s of 66.133468628s, submitted: 3
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:15.337861+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931284 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:16.338041+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:17.338196+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:18.338342+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83402752 unmapped: 1409024 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:19.338539+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83402752 unmapped: 1409024 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:20.338768+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83402752 unmapped: 1409024 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931300 data_alloc: 218103808 data_used: 135168
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:21.338970+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83402752 unmapped: 1409024 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:22.339176+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83402752 unmapped: 1409024 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:23.339329+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83402752 unmapped: 1409024 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:24.339522+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83402752 unmapped: 1409024 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:25.339681+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83402752 unmapped: 1409024 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931300 data_alloc: 218103808 data_used: 135168
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:26.339848+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83402752 unmapped: 1409024 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:27.340050+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.004903793s of 12.174333572s, submitted: 10
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83402752 unmapped: 1409024 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:28.340235+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83402752 unmapped: 1409024 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:29.340370+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83402752 unmapped: 1409024 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:30.340593+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83402752 unmapped: 1409024 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930693 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:31.340748+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83402752 unmapped: 1409024 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:32.340879+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:33.341043+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:34.341224+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:35.341375+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:36.341526+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:37.341689+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:38.341842+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:39.341972+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:40.342180+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:41.342337+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:42.342554+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:43.342729+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:44.342965+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:45.343203+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:46.343490+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:47.343676+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:48.343899+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:49.344056+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:50.344270+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:51.344442+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:52.344620+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:53.344806+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:54.344945+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:55.345132+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:56.345334+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:57.345535+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:58.345698+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:59.345883+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:00.346069+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:01.346225+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:02.346423+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:03.346603+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:04.346835+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:05.347071+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:06.347281+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:07.347463+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:08.347618+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:09.347784+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:10.347984+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:11.348253+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:12.348436+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:13.348660+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:14.348810+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:15.348943+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:16.349088+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:17.349284+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:18.349461+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:19.349593+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:20.349794+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:21.349980+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:22.350195+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:23.350361+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:24.350485+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:25.350620+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:26.350813+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:27.350939+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:28.351085+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:29.351318+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:30.351632+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:31.351834+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:32.351973+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:33.352095+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:34.352226+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:35.352410+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:36.352564+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:37.352721+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:38.352895+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:39.353064+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:40.353213+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:41.353335+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:42.353514+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:43.353679+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:44.353857+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:45.353986+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:46.354117+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:47.354261+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:48.354414+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:49.354553+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:50.354817+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:51.354981+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:52.355119+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:53.355225+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:54.355399+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:55.355574+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:56.355742+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:57.355911+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:58.356078+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:59.356227+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:00.356486+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:01.356641+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:02.356863+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:03.357012+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:04.357175+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:05.357348+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Cumulative writes: 7804 writes, 31K keys, 7804 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.02 MB/s
                                           Cumulative WAL: 7804 writes, 1639 syncs, 4.76 writes per sync, written: 0.02 GB, 0.02 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 625 writes, 1051 keys, 625 commit groups, 1.0 writes per commit group, ingest: 0.41 MB, 0.00 MB/s
                                           Interval WAL: 625 writes, 306 syncs, 2.04 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.013       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.013       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.01              0.00         1    0.013       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fb227cf350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fb227cf350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fb227cf350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fb227cf350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fb227cf350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fb227cf350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fb227cf350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fb227ce9b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 6e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fb227ce9b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 6e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fb227ce9b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 6e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fb227cf350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fb227cf350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:06.357561+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:07.357712+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:08.357891+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:09.358052+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:10.358257+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:11.358385+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:12.358513+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:13.358689+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:14.358867+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:15.359079+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb26b301e0
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:16.359290+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:17.359475+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:18.359637+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:19.359802+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:20.360023+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:21.360183+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:22.360364+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:23.360525+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:24.360698+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:25.360827+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:26.360984+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2417d400
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 119.449485779s of 119.461830139s, submitted: 2
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:27.361111+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:28.361224+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:29.361399+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:30.361631+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:31.361799+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932221 data_alloc: 218103808 data_used: 135168
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:32.361949+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83443712 unmapped: 1368064 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:33.362138+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83443712 unmapped: 1368064 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:34.362277+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83443712 unmapped: 1368064 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:35.362398+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83443712 unmapped: 1368064 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:36.362516+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932221 data_alloc: 218103808 data_used: 135168
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:37.362672+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83443712 unmapped: 1368064 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:38.362809+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83443712 unmapped: 1368064 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:39.362973+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83443712 unmapped: 1368064 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:40.363138+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83443712 unmapped: 1368064 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 14.086344719s of 14.123706818s, submitted: 11
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:41.363281+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83443712 unmapped: 1368064 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932089 data_alloc: 218103808 data_used: 135168
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:42.363654+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83443712 unmapped: 1368064 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:43.363790+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83443712 unmapped: 1368064 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:44.363951+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83443712 unmapped: 1368064 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:45.364100+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83443712 unmapped: 1368064 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:46.364276+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83443712 unmapped: 1368064 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932073 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:47.364410+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83443712 unmapped: 1368064 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:48.364544+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83443712 unmapped: 1368064 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:49.364698+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83443712 unmapped: 1368064 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:50.364936+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83443712 unmapped: 1368064 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:51.365063+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83443712 unmapped: 1368064 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932073 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:52.365203+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:53.365335+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:54.365495+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:55.365667+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:56.365818+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932073 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread fragmentation_score=0.000028 took=0.000254s
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:57.366025+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:58.366254+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:59.366408+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:00.366682+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:01.366897+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932073 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:02.367119+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:03.367372+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:04.367553+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:05.367733+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:06.367925+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932073 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:07.368075+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:08.368251+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:09.368521+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:10.368731+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:11.368926+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932073 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:12.369220+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:13.369393+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:14.369673+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:15.369845+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:16.370047+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932073 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:17.370232+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:18.370420+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:19.370628+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:20.370888+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:21.371113+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932073 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:22.371387+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:23.371572+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:24.371900+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:25.372131+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:26.372370+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932073 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:27.372586+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:28.372759+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:29.372916+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:30.373199+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:31.373583+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932073 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:32.374062+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:33.374463+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:34.374881+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:35.375228+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:36.375467+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932073 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:37.375758+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:38.376043+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:39.376256+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:40.376462+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:41.376647+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932073 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:42.376814+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:43.377051+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:44.377263+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:45.377417+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:46.377584+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932073 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:47.377722+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:48.377955+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:49.378214+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:50.378455+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:51.378642+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932073 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:52.378842+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:53.379058+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:54.379240+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:55.379435+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:56.379565+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932073 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:57.379718+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:58.379979+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:59.380256+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:00.380517+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:01.380720+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 ms_handle_reset con 0x55fb2417d400 session 0x55fb26f51a40
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932073 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:02.381222+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:03.381367+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83468288 unmapped: 1343488 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:04.381505+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83468288 unmapped: 1343488 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:05.381651+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83468288 unmapped: 1343488 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:06.381812+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83468288 unmapped: 1343488 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932073 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:07.381974+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83468288 unmapped: 1343488 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:08.382133+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83468288 unmapped: 1343488 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:09.382350+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83468288 unmapped: 1343488 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:10.382537+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83468288 unmapped: 1343488 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:11.382716+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83468288 unmapped: 1343488 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932073 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:12.382874+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb25ef5800
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 90.763069153s of 91.630355835s, submitted: 1
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83476480 unmapped: 1335296 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:13.383056+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83476480 unmapped: 1335296 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:14.383214+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83476480 unmapped: 1335296 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:15.383366+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83476480 unmapped: 1335296 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:16.383509+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83476480 unmapped: 1335296 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932221 data_alloc: 218103808 data_used: 135168
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:17.383628+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83476480 unmapped: 1335296 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:18.383778+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83484672 unmapped: 1327104 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:19.383973+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83484672 unmapped: 1327104 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:20.384204+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83484672 unmapped: 1327104 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:21.384343+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83484672 unmapped: 1327104 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932974 data_alloc: 218103808 data_used: 135168
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:22.384539+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83484672 unmapped: 1327104 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:23.384715+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83484672 unmapped: 1327104 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:24.384865+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83484672 unmapped: 1327104 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:25.385048+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83484672 unmapped: 1327104 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:26.385203+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 14.195486069s of 14.243807793s, submitted: 12
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83492864 unmapped: 1318912 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932994 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:27.385345+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83492864 unmapped: 1318912 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:28.385485+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83492864 unmapped: 1318912 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:29.385683+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83492864 unmapped: 1318912 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:30.385873+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83615744 unmapped: 1196032 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:31.385995+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [0,1])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83877888 unmapped: 933888 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 podman[240100]: 2025-12-06 10:17:30.770478406 +0000 UTC m=+0.059075642 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932994 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:32.386199+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83877888 unmapped: 933888 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:33.386378+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83877888 unmapped: 933888 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:34.386514+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83877888 unmapped: 933888 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:35.386711+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83886080 unmapped: 925696 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:36.386875+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83886080 unmapped: 925696 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932994 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:37.387041+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83886080 unmapped: 925696 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:38.387209+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83886080 unmapped: 925696 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:39.387358+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83886080 unmapped: 925696 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:40.388999+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83894272 unmapped: 917504 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:41.389191+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83894272 unmapped: 917504 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932994 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:42.389351+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83894272 unmapped: 917504 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:43.389512+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83894272 unmapped: 917504 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:44.389696+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83894272 unmapped: 917504 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:45.389858+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83894272 unmapped: 917504 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:46.389997+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83894272 unmapped: 917504 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932994 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:47.390237+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83902464 unmapped: 909312 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:48.390400+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83902464 unmapped: 909312 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:49.390558+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83902464 unmapped: 909312 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:50.390769+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83902464 unmapped: 909312 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:51.390966+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83902464 unmapped: 909312 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932994 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:52.391165+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83902464 unmapped: 909312 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:53.391379+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83902464 unmapped: 909312 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:54.391544+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83910656 unmapped: 901120 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:55.391781+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83910656 unmapped: 901120 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:56.391936+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83910656 unmapped: 901120 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932994 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:57.392226+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83910656 unmapped: 901120 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:58.392382+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83910656 unmapped: 901120 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:59.392548+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83910656 unmapped: 901120 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:00.392771+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83910656 unmapped: 901120 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:01.393006+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83910656 unmapped: 901120 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932994 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:02.393297+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83910656 unmapped: 901120 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:03.393455+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83910656 unmapped: 901120 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:04.393693+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83918848 unmapped: 892928 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:05.393926+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83918848 unmapped: 892928 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:06.394103+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83918848 unmapped: 892928 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932994 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:07.394263+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83918848 unmapped: 892928 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:08.394419+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83918848 unmapped: 892928 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:09.394645+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:10.394912+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:11.395171+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932994 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:12.395325+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:13.395491+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:14.395618+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 ms_handle_reset con 0x55fb2417cc00 session 0x55fb268f6000
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:15.395801+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:16.395944+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932994 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:17.396127+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:18.396372+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:19.396509+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:20.396709+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:21.396877+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932994 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:22.397090+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:23.397240+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:24.397406+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb26b42c00
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 57.946811676s of 58.467189789s, submitted: 205
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:25.397605+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:26.397785+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 933126 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:27.397961+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:28.398135+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:29.398328+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:30.398514+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:31.398658+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 936166 data_alloc: 218103808 data_used: 135168
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:32.398823+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:33.398978+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:34.399124+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:35.399358+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:36.399513+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935407 data_alloc: 218103808 data_used: 135168
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:37.399698+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:38.399827+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:39.399999+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 14.869669914s of 14.905448914s, submitted: 12
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:40.400235+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:41.400375+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935427 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:42.400513+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:43.400784+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:44.400924+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:45.401112+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:46.401264+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935427 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:47.401456+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:48.401650+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:49.401852+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:50.402115+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:51.402283+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935427 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:52.402459+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:53.402650+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:54.402828+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:55.402973+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:56.403193+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935427 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:57.403371+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:58.403506+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:59.403653+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:00.403851+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:01.404023+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935427 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:02.404210+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:03.404360+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:04.404517+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:05.404649+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:06.404783+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935427 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:07.405041+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:08.405198+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:09.405332+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:10.405547+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:11.407332+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:12.408190+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935427 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:13.408573+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:14.410101+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:15.410931+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:16.411241+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:17.411453+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935427 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:18.411604+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:19.411845+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:20.412103+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:21.412303+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:22.412565+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935427 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:23.412794+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:24.412985+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:25.413179+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:26.413383+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:27.413605+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935427 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:28.413835+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:29.414131+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:30.414563+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:31.414891+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:32.415046+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935427 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:33.415237+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:34.415404+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:35.415566+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:36.415773+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:37.415920+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935427 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:38.416096+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:39.416277+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:40.416518+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:41.416683+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:42.416879+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935427 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:43.417044+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:44.417247+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:45.417417+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:46.417605+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:47.417706+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935427 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:48.417836+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:49.417968+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:50.418198+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:51.418392+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:52.418539+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935427 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:53.418701+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:54.418850+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:55.419040+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:56.419203+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:57.419392+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935427 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:58.419505+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:59.419677+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:00.419894+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:01.420058+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:02.420212+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935427 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:03.420462+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:04.420596+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:05.420824+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:06.421022+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:07.421253+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83943424 unmapped: 868352 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935427 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:08.421440+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83943424 unmapped: 868352 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:09.421660+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83943424 unmapped: 868352 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:10.421874+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83943424 unmapped: 868352 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:11.422045+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83943424 unmapped: 868352 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:12.422234+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83943424 unmapped: 868352 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935427 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:13.422364+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 146 handle_osd_map epochs [147,147], i have 146, src has [1,147]
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 93.416572571s of 93.421234131s, submitted: 1
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83943424 unmapped: 868352 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:14.422527+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83943424 unmapped: 868352 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb26b43400
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _renew_subs
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 147 handle_osd_map epochs [148,148], i have 147, src has [1,148]
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:15.422658+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 148 heartbeat osd_stat(store_statfs(0x4fc1e1000/0x0/0x4ffc00000, data 0x574248/0x62a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83984384 unmapped: 10141696 heap: 94126080 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 148 handle_osd_map epochs [148,149], i have 148, src has [1,149]
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 149 ms_handle_reset con 0x55fb26b43400 session 0x55fb26f9b2c0
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:16.422783+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb26b43400
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84090880 unmapped: 18432000 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:17.422943+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84131840 unmapped: 18391040 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1056837 data_alloc: 218103808 data_used: 143360
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 149 handle_osd_map epochs [149,150], i have 149, src has [1,150]
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 150 ms_handle_reset con 0x55fb26b43400 session 0x55fb26c185a0
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:18.423127+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 150 heartbeat osd_stat(store_statfs(0x4fb56a000/0x0/0x4ffc00000, data 0x11e847b/0x12a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84131840 unmapped: 18391040 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:19.423349+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84131840 unmapped: 18391040 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:20.423680+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84131840 unmapped: 18391040 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 150 heartbeat osd_stat(store_statfs(0x4fb56a000/0x0/0x4ffc00000, data 0x11e847b/0x12a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:21.423915+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84131840 unmapped: 18391040 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:22.424128+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84131840 unmapped: 18391040 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1062395 data_alloc: 218103808 data_used: 143360
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 150 handle_osd_map epochs [151,151], i have 150, src has [1,151]
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:23.424360+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84140032 unmapped: 18382848 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 151 heartbeat osd_stat(store_statfs(0x4fb567000/0x0/0x4ffc00000, data 0x11ea44d/0x12a4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:24.424575+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84140032 unmapped: 18382848 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 151 heartbeat osd_stat(store_statfs(0x4fb567000/0x0/0x4ffc00000, data 0x11ea44d/0x12a4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:25.424818+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84115456 unmapped: 18407424 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:26.425034+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84115456 unmapped: 18407424 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 151 heartbeat osd_stat(store_statfs(0x4fb567000/0x0/0x4ffc00000, data 0x11ea44d/0x12a4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:27.425226+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1065153 data_alloc: 218103808 data_used: 143360
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:28.425379+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:29.425534+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 151 heartbeat osd_stat(store_statfs(0x4fb567000/0x0/0x4ffc00000, data 0x11ea44d/0x12a4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:30.425754+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:31.425957+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 151 heartbeat osd_stat(store_statfs(0x4fb567000/0x0/0x4ffc00000, data 0x11ea44d/0x12a4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:32.426102+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1065153 data_alloc: 218103808 data_used: 143360
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:33.426264+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 151 heartbeat osd_stat(store_statfs(0x4fb567000/0x0/0x4ffc00000, data 0x11ea44d/0x12a4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:34.426526+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 151 heartbeat osd_stat(store_statfs(0x4fb567000/0x0/0x4ffc00000, data 0x11ea44d/0x12a4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:35.426698+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 151 ms_handle_reset con 0x55fb24bd2400 session 0x55fb26acb4a0
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:36.426872+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:37.427055+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1065153 data_alloc: 218103808 data_used: 143360
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:38.427222+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:39.427404+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 151 heartbeat osd_stat(store_statfs(0x4fb567000/0x0/0x4ffc00000, data 0x11ea44d/0x12a4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:40.427610+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 151 ms_handle_reset con 0x55fb26b42c00 session 0x55fb26db8d20
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:41.427787+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:42.427981+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1065153 data_alloc: 218103808 data_used: 143360
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:43.428260+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:44.428437+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 151 heartbeat osd_stat(store_statfs(0x4fb567000/0x0/0x4ffc00000, data 0x11ea44d/0x12a4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:45.428633+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:46.428802+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb235d5c00
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 33.286060333s of 33.423255920s, submitted: 40
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:47.429011+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1064445 data_alloc: 218103808 data_used: 143360
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:48.429233+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84131840 unmapped: 18391040 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:49.429399+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84131840 unmapped: 18391040 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 151 heartbeat osd_stat(store_statfs(0x4fb568000/0x0/0x4ffc00000, data 0x11ea44d/0x12a4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:50.429672+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84131840 unmapped: 18391040 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 151 heartbeat osd_stat(store_statfs(0x4fb568000/0x0/0x4ffc00000, data 0x11ea44d/0x12a4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2417cc00
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:51.429869+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84131840 unmapped: 18391040 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:52.430038+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84131840 unmapped: 18391040 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1064593 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:53.430268+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84172800 unmapped: 18350080 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:54.430460+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84172800 unmapped: 18350080 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 151 heartbeat osd_stat(store_statfs(0x4fb568000/0x0/0x4ffc00000, data 0x11ea44d/0x12a4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:55.430619+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84172800 unmapped: 18350080 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:56.430781+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 151 heartbeat osd_stat(store_statfs(0x4fb568000/0x0/0x4ffc00000, data 0x11ea44d/0x12a4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84131840 unmapped: 18391040 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:57.430976+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84131840 unmapped: 18391040 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 151 heartbeat osd_stat(store_statfs(0x4fb568000/0x0/0x4ffc00000, data 0x11ea44d/0x12a4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1063986 data_alloc: 218103808 data_used: 143360
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:58.431223+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84131840 unmapped: 18391040 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:59.431393+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.490637779s of 12.528245926s, submitted: 11
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84140032 unmapped: 18382848 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:00.431667+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 151 heartbeat osd_stat(store_statfs(0x4fb568000/0x0/0x4ffc00000, data 0x11ea44d/0x12a4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84140032 unmapped: 18382848 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:01.431837+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84140032 unmapped: 18382848 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:02.432022+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84140032 unmapped: 18382848 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1063702 data_alloc: 218103808 data_used: 139264
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:03.432295+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84140032 unmapped: 18382848 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:04.432488+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2417d400
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 151 ms_handle_reset con 0x55fb2417d400 session 0x55fb268d2b40
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 88793088 unmapped: 13729792 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:05.432813+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 151 handle_osd_map epochs [151,152], i have 151, src has [1,152]
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 89841664 unmapped: 12681216 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 152 heartbeat osd_stat(store_statfs(0x4fb568000/0x0/0x4ffc00000, data 0x11ea44d/0x12a4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:06.433018+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb24bd1c00
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _renew_subs
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 152 handle_osd_map epochs [153,153], i have 152, src has [1,153]
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 90734592 unmapped: 11788288 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 153 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb240bc3c0
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:07.433240+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 90210304 unmapped: 12312576 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1113326 data_alloc: 218103808 data_used: 4796416
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:08.433505+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 90243072 unmapped: 12279808 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:09.433725+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 90243072 unmapped: 12279808 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 153 heartbeat osd_stat(store_statfs(0x4fb1bc000/0x0/0x4ffc00000, data 0x1592679/0x164e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:10.433934+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb24bd1c00
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.364974976s of 10.899864197s, submitted: 40
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 153 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb24f02b40
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 90275840 unmapped: 12247040 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:11.434108+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 90275840 unmapped: 12247040 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2417d400
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:12.434284+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 93954048 unmapped: 8568832 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1138423 data_alloc: 218103808 data_used: 8523776
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:13.434459+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 93954048 unmapped: 8568832 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:14.434617+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _renew_subs
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 153 handle_osd_map epochs [154,154], i have 153, src has [1,154]
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 93954048 unmapped: 8568832 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:15.434769+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fb1b9000/0x0/0x4ffc00000, data 0x159466e/0x1652000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 93954048 unmapped: 8568832 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:16.434932+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 93954048 unmapped: 8568832 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb235d5c00 session 0x55fb268d3e00
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:17.435209+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 93954048 unmapped: 8568832 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1142141 data_alloc: 218103808 data_used: 8527872
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:18.435365+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 93954048 unmapped: 8568832 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:19.435705+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 93954048 unmapped: 8568832 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fb1b9000/0x0/0x4ffc00000, data 0x159466e/0x1652000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:20.436020+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 93954048 unmapped: 8568832 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:21.437500+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 93954048 unmapped: 8568832 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:22.438211+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fb1b9000/0x0/0x4ffc00000, data 0x159466e/0x1652000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 93954048 unmapped: 8568832 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1142141 data_alloc: 218103808 data_used: 8527872
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.835918427s of 12.857731819s, submitted: 18
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:23.438585+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fb1b9000/0x0/0x4ffc00000, data 0x159466e/0x1652000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [0,0,1])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 96157696 unmapped: 6365184 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:24.439057+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fad9d000/0x0/0x4ffc00000, data 0x19a366e/0x1a61000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0.
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 99008512 unmapped: 3514368 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:25.439486+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 99008512 unmapped: 3514368 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9be3000/0x0/0x4ffc00000, data 0x19b566e/0x1a73000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:26.439723+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 99008512 unmapped: 3514368 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:27.439902+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb26b42c00
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1175489 data_alloc: 218103808 data_used: 8544256
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 98172928 unmapped: 4349952 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:28.440665+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 98172928 unmapped: 4349952 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:29.441251+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 98172928 unmapped: 4349952 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:30.441650+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 98172928 unmapped: 4349952 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:31.442090+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 98172928 unmapped: 4349952 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9bf9000/0x0/0x4ffc00000, data 0x19b566e/0x1a73000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:32.443058+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1175673 data_alloc: 218103808 data_used: 8540160
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 98181120 unmapped: 4341760 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:33.443649+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.407831192s of 10.563117027s, submitted: 55
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 98181120 unmapped: 4341760 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:34.443977+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 98181120 unmapped: 4341760 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:35.444533+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 98181120 unmapped: 4341760 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:36.444937+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 98181120 unmapped: 4341760 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:37.445253+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9bf9000/0x0/0x4ffc00000, data 0x19b566e/0x1a73000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1175505 data_alloc: 218103808 data_used: 8540160
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 98181120 unmapped: 4341760 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:38.445488+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 98181120 unmapped: 4341760 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:39.445722+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 98181120 unmapped: 4341760 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:40.445929+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9bf9000/0x0/0x4ffc00000, data 0x19b566e/0x1a73000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 98181120 unmapped: 4341760 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:41.446079+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 98181120 unmapped: 4341760 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:42.446316+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1175205 data_alloc: 218103808 data_used: 8540160
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 98181120 unmapped: 4341760 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:43.446523+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 98181120 unmapped: 4341760 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:44.446772+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb26b43400
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb26b43400 session 0x55fb271ad0e0
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 98181120 unmapped: 4341760 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:45.447080+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9bf9000/0x0/0x4ffc00000, data 0x19b566e/0x1a73000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb26b43800
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.870066643s of 11.887675285s, submitted: 5
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb26b43800 session 0x55fb271ac000
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 97779712 unmapped: 13139968 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:46.447321+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 97779712 unmapped: 13139968 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:47.447532+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f92da000/0x0/0x4ffc00000, data 0x22d466e/0x2392000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1243431 data_alloc: 218103808 data_used: 8544256
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 97845248 unmapped: 13074432 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:48.447754+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 97845248 unmapped: 13074432 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:49.448061+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 97845248 unmapped: 13074432 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:50.448375+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb26b43c00
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb26b43c00 session 0x55fb26a99e00
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 97845248 unmapped: 13074432 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:51.448547+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f92da000/0x0/0x4ffc00000, data 0x22d466e/0x2392000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 97845248 unmapped: 13074432 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:52.448689+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb235d5c00
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb235d5c00 session 0x55fb23791e00
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1243431 data_alloc: 218103808 data_used: 8544256
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 97861632 unmapped: 13058048 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:53.448856+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb24bd1c00
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb26aca1e0
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb26b43400
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb26b43400 session 0x55fb2422b860
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 97607680 unmapped: 13312000 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:54.449062+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f92b6000/0x0/0x4ffc00000, data 0x22f866e/0x23b6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb26b43800
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f92b6000/0x0/0x4ffc00000, data 0x22f866e/0x23b6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 97607680 unmapped: 13312000 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:55.449220+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb26b42400
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 101277696 unmapped: 9641984 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:56.449400+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105553920 unmapped: 5365760 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:57.449551+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f92b4000/0x0/0x4ffc00000, data 0x22f966e/0x23b7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1311187 data_alloc: 234881024 data_used: 17436672
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105586688 unmapped: 5332992 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:58.449683+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105586688 unmapped: 5332992 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:59.449834+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105586688 unmapped: 5332992 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:00.450037+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105586688 unmapped: 5332992 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:01.450233+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105594880 unmapped: 5324800 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:02.450369+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f92b4000/0x0/0x4ffc00000, data 0x22f966e/0x23b7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1311187 data_alloc: 234881024 data_used: 17436672
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105594880 unmapped: 5324800 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:03.450472+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f92b4000/0x0/0x4ffc00000, data 0x22f966e/0x23b7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f92b4000/0x0/0x4ffc00000, data 0x22f966e/0x23b7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105594880 unmapped: 5324800 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:04.450636+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105594880 unmapped: 5324800 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:05.450735+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb26b42c00 session 0x55fb24bb70e0
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105594880 unmapped: 5324800 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:06.450846+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 20.669349670s of 20.780309677s, submitted: 12
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109428736 unmapped: 3588096 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:07.451013+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1383089 data_alloc: 234881024 data_used: 18948096
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110968832 unmapped: 2048000 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:08.451188+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110968832 unmapped: 2048000 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:09.451406+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f8aff000/0x0/0x4ffc00000, data 0x2aaf66e/0x2b6d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110968832 unmapped: 2048000 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:10.451621+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110968832 unmapped: 2048000 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:11.451779+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110968832 unmapped: 2048000 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:12.451950+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f8aff000/0x0/0x4ffc00000, data 0x2aaf66e/0x2b6d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1383089 data_alloc: 234881024 data_used: 18948096
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 111001600 unmapped: 2015232 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:13.452109+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f8aff000/0x0/0x4ffc00000, data 0x2aaf66e/0x2b6d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 111001600 unmapped: 2015232 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:14.452288+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 111001600 unmapped: 2015232 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:15.452624+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 111001600 unmapped: 2015232 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:16.452893+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 111001600 unmapped: 2015232 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb26b42400 session 0x55fb26d654a0
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb26b43800 session 0x55fb26da5c20
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:17.453096+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb235d5c00
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.124892235s of 11.262226105s, submitted: 63
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb25c3c400
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25c3c400 session 0x55fb26a990e0
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1183609 data_alloc: 218103808 data_used: 7954432
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 103227392 unmapped: 9789440 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:18.453303+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 103227392 unmapped: 9789440 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9bf8000/0x0/0x4ffc00000, data 0x19b666e/0x1a74000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:19.453471+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417cc00 session 0x55fb26a96f00
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9bf8000/0x0/0x4ffc00000, data 0x19b666e/0x1a74000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 103227392 unmapped: 9789440 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:20.453712+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 103227392 unmapped: 9789440 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:21.453886+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 103227392 unmapped: 9789440 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:22.454052+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9bf8000/0x0/0x4ffc00000, data 0x19b666e/0x1a74000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1183625 data_alloc: 218103808 data_used: 7950336
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 103227392 unmapped: 9789440 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:23.454239+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417d400 session 0x55fb26da4780
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb23f50000
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb23f50000 session 0x55fb26f39c20
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100892672 unmapped: 12124160 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:24.454396+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3be000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100892672 unmapped: 12124160 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:25.454581+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3be000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100892672 unmapped: 12124160 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:26.454761+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3be000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3be000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100892672 unmapped: 12124160 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:27.454938+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1099353 data_alloc: 218103808 data_used: 4796416
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100892672 unmapped: 12124160 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:28.455047+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100892672 unmapped: 12124160 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:29.455219+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100892672 unmapped: 12124160 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:30.455418+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3be000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2417d400
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.781655312s of 12.873902321s, submitted: 32
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100900864 unmapped: 12115968 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:31.455588+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100900864 unmapped: 12115968 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:32.455693+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1099077 data_alloc: 218103808 data_used: 4792320
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100909056 unmapped: 12107776 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:33.455823+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100917248 unmapped: 12099584 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:34.455976+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100917248 unmapped: 12099584 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:35.456119+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100917248 unmapped: 12099584 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:36.456258+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100917248 unmapped: 12099584 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:37.456443+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1100589 data_alloc: 218103808 data_used: 4792320
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100917248 unmapped: 12099584 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:38.456616+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100917248 unmapped: 12099584 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:39.456769+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100917248 unmapped: 12099584 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:40.456966+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100917248 unmapped: 12099584 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:41.457117+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100917248 unmapped: 12099584 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:42.457321+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.064629555s of 12.113365173s, submitted: 14
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1100573 data_alloc: 218103808 data_used: 4796416
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100917248 unmapped: 12099584 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:43.457457+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100917248 unmapped: 12099584 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:44.457656+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100917248 unmapped: 12099584 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:45.457838+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100917248 unmapped: 12099584 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:46.458033+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100917248 unmapped: 12099584 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:47.458249+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1100441 data_alloc: 218103808 data_used: 4796416
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:48.458399+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100917248 unmapped: 12099584 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:49.458580+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100917248 unmapped: 12099584 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2417cc00
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:50.458772+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 104062976 unmapped: 13164544 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417cc00 session 0x55fb26649680
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:51.458926+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100925440 unmapped: 16302080 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:52.459217+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100925440 unmapped: 16302080 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9d46000/0x0/0x4ffc00000, data 0x186964b/0x1926000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1152691 data_alloc: 218103808 data_used: 4796416
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:53.459432+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100925440 unmapped: 16302080 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9d46000/0x0/0x4ffc00000, data 0x186964b/0x1926000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:54.459571+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100925440 unmapped: 16302080 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:55.459803+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100933632 unmapped: 16293888 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb25c3c400
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 13.373836517s of 13.440272331s, submitted: 18
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25c3c400 session 0x55fb240be1e0
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:56.459989+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100974592 unmapped: 16252928 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb26b42c00
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9d46000/0x0/0x4ffc00000, data 0x186964b/0x1926000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:57.460217+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100982784 unmapped: 16244736 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1179880 data_alloc: 218103808 data_used: 8491008
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:58.460391+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 101285888 unmapped: 15941632 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:59.460617+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 101965824 unmapped: 15261696 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9d45000/0x0/0x4ffc00000, data 0x186966e/0x1927000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:00.460863+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 101965824 unmapped: 15261696 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:01.461049+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 101965824 unmapped: 15261696 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:02.461244+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 101965824 unmapped: 15261696 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:03.461396+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192648 data_alloc: 234881024 data_used: 10371072
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 101965824 unmapped: 15261696 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9d45000/0x0/0x4ffc00000, data 0x186966e/0x1927000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:04.461589+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 101965824 unmapped: 15261696 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:05.461751+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 101965824 unmapped: 15261696 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:06.461893+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9d45000/0x0/0x4ffc00000, data 0x186966e/0x1927000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 101965824 unmapped: 15261696 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:07.462098+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 101965824 unmapped: 15261696 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:08.462258+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192648 data_alloc: 234881024 data_used: 10371072
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 101965824 unmapped: 15261696 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.659880638s of 12.670410156s, submitted: 4
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:09.462443+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106905600 unmapped: 10321920 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:10.462670+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 9469952 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:11.462852+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 9469952 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9049000/0x0/0x4ffc00000, data 0x255c66e/0x261a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:12.463020+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 9469952 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:13.463241+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1313916 data_alloc: 234881024 data_used: 12308480
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 9469952 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:14.463390+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 9469952 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9049000/0x0/0x4ffc00000, data 0x255c66e/0x261a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:15.463527+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106782720 unmapped: 10444800 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:16.463736+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106782720 unmapped: 10444800 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:17.463902+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106782720 unmapped: 10444800 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:18.464085+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1306724 data_alloc: 234881024 data_used: 12316672
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106782720 unmapped: 10444800 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:19.464244+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106790912 unmapped: 10436608 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:20.464426+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f904f000/0x0/0x4ffc00000, data 0x255f66e/0x261d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106790912 unmapped: 10436608 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:21.464579+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106790912 unmapped: 10436608 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:22.464749+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106790912 unmapped: 10436608 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:23.464905+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1306724 data_alloc: 234881024 data_used: 12316672
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106790912 unmapped: 10436608 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:24.465042+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106790912 unmapped: 10436608 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f904f000/0x0/0x4ffc00000, data 0x255f66e/0x261d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:25.465223+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106790912 unmapped: 10436608 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:26.465363+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106790912 unmapped: 10436608 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:27.465547+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106799104 unmapped: 10428416 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:28.465704+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1306724 data_alloc: 234881024 data_used: 12316672
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106799104 unmapped: 10428416 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:29.465879+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106799104 unmapped: 10428416 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:30.466086+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106799104 unmapped: 10428416 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f904f000/0x0/0x4ffc00000, data 0x255f66e/0x261d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:31.466267+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106815488 unmapped: 10412032 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:32.466448+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106815488 unmapped: 10412032 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:33.466626+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308244 data_alloc: 234881024 data_used: 12402688
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106823680 unmapped: 10403840 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:34.466769+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 25.410940170s of 25.654003143s, submitted: 125
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106831872 unmapped: 10395648 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb26b42c00 session 0x55fb26da41e0
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2435d000
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:35.466896+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2435d000 session 0x55fb24e9bc20
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:36.467048+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9ccc000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:37.467198+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:38.467374+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1112810 data_alloc: 218103808 data_used: 4796416
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:39.467715+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:40.467935+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9ccc000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:41.468100+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:42.468265+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9ccc000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:43.468416+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1112810 data_alloc: 218103808 data_used: 4796416
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:44.468567+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:45.468775+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:46.468986+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:47.469191+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:48.469389+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1112810 data_alloc: 218103808 data_used: 4796416
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9ccc000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:49.469543+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:50.469758+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:51.469957+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:52.470165+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9ccc000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:53.470308+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1112810 data_alloc: 218103808 data_used: 4796416
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:54.470475+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:55.470688+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9ccc000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:56.470871+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:57.471110+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:58.471271+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1112810 data_alloc: 218103808 data_used: 4796416
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:59.471419+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9ccc000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:00.471608+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:01.471747+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:02.471940+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb26c92000
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb26c92000 session 0x55fb2422bc20
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2417cc00
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417cc00 session 0x55fb26acb2c0
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2435d000
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2435d000 session 0x55fb26acb860
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb25c3c400
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25c3c400 session 0x55fb26acb4a0
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb26b42c00
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 28.730512619s of 28.779657364s, submitted: 23
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:03.472067+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb26b42c00 session 0x55fb26acaf00
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb24bd1c00
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb24e881e0
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb24bd1c00
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb26aca1e0
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2417cc00
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417cc00 session 0x55fb24e88d20
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb25c3c400
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163827 data_alloc: 218103808 data_used: 4796416
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25c3c400 session 0x55fb24e89c20
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 103374848 unmapped: 22249472 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:04.472212+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 103374848 unmapped: 22249472 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:05.472371+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9ddf000/0x0/0x4ffc00000, data 0x17cf65b/0x188d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 103374848 unmapped: 22249472 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:06.472503+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb26b42c00
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb26b42c00 session 0x55fb26b30000
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 103374848 unmapped: 22249472 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb261b6c00
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb261b6c00 session 0x55fb26b301e0
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:07.472641+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2417cc00
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417cc00 session 0x55fb26b30780
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 103383040 unmapped: 22241280 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb24bd1c00
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:08.472833+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163827 data_alloc: 218103808 data_used: 4796416
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 103383040 unmapped: 22241280 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:09.473021+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 103383040 unmapped: 22241280 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb24f02b40
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb235d5c00 session 0x55fb268fe1e0
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:10.473270+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb25c3c400
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 103391232 unmapped: 22233088 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9ddf000/0x0/0x4ffc00000, data 0x17cf65b/0x188d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:11.473430+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 103391232 unmapped: 22233088 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:12.473569+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105062400 unmapped: 20561920 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:13.473723+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1195423 data_alloc: 234881024 data_used: 9515008
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105062400 unmapped: 20561920 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:14.473877+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9ddf000/0x0/0x4ffc00000, data 0x17cf65b/0x188d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105062400 unmapped: 20561920 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:15.474024+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9ddf000/0x0/0x4ffc00000, data 0x17cf65b/0x188d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105062400 unmapped: 20561920 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:16.474215+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105062400 unmapped: 20561920 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:17.474363+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105062400 unmapped: 20561920 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9ddf000/0x0/0x4ffc00000, data 0x17cf65b/0x188d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:18.474506+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1195423 data_alloc: 234881024 data_used: 9515008
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 104669184 unmapped: 20955136 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:19.474674+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 104669184 unmapped: 20955136 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:20.474892+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 104677376 unmapped: 20946944 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb261b7800
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 17.621421814s of 17.702753067s, submitted: 25
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:21.475036+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 104677376 unmapped: 20946944 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:22.475230+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105635840 unmapped: 19988480 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:23.475449+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1205371 data_alloc: 234881024 data_used: 9601024
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9cc9000/0x0/0x4ffc00000, data 0x18e565b/0x19a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106078208 unmapped: 19546112 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:24.475600+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105332736 unmapped: 20291584 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:25.475804+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105332736 unmapped: 20291584 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9c3e000/0x0/0x4ffc00000, data 0x196f65b/0x1a2d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:26.475969+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105332736 unmapped: 20291584 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:27.476098+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105340928 unmapped: 20283392 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:28.476271+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1216501 data_alloc: 234881024 data_used: 9588736
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105340928 unmapped: 20283392 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:29.476423+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105472000 unmapped: 20152320 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:30.476615+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105472000 unmapped: 20152320 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:31.476780+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105472000 unmapped: 20152320 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9c20000/0x0/0x4ffc00000, data 0x198e65b/0x1a4c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:32.476929+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9c20000/0x0/0x4ffc00000, data 0x198e65b/0x1a4c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.623050690s of 11.763713837s, submitted: 49
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105472000 unmapped: 20152320 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:33.477138+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1216802 data_alloc: 234881024 data_used: 9592832
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105472000 unmapped: 20152320 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:34.477404+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105472000 unmapped: 20152320 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:35.477640+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105529344 unmapped: 20094976 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:36.477829+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105529344 unmapped: 20094976 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:37.478286+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105529344 unmapped: 20094976 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:38.478524+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9c12000/0x0/0x4ffc00000, data 0x199c65b/0x1a5a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1217114 data_alloc: 234881024 data_used: 9592832
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105529344 unmapped: 20094976 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:39.478693+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105529344 unmapped: 20094976 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:40.481371+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb25fd8000
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25fd8000 session 0x55fb240c32c0
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb240d5800
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb240d5800 session 0x55fb24f023c0
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb235d5c00
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb235d5c00 session 0x55fb268fc960
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2417cc00
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417cc00 session 0x55fb26ae6960
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb24bd1c00
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb26da43c0
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb25fd8000
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25fd8000 session 0x55fb24bd4f00
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb263e8c00
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb263e8c00 session 0x55fb24bd5c20
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb235d5c00
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105783296 unmapped: 23511040 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb235d5c00 session 0x55fb24bd5a40
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2417cc00
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417cc00 session 0x55fb24bd43c0
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:41.481603+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f93a0000/0x0/0x4ffc00000, data 0x220d66b/0x22cc000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105783296 unmapped: 23511040 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:42.481749+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105791488 unmapped: 23502848 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:43.482230+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1284570 data_alloc: 234881024 data_used: 9592832
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105791488 unmapped: 23502848 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:44.482636+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f939d000/0x0/0x4ffc00000, data 0x221066b/0x22cf000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105791488 unmapped: 23502848 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:45.483092+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105791488 unmapped: 23502848 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:46.483607+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105791488 unmapped: 23502848 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb24bd1c00
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:47.484073+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105791488 unmapped: 23502848 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:48.484413+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1316490 data_alloc: 234881024 data_used: 14340096
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 111575040 unmapped: 17719296 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:49.484719+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112599040 unmapped: 16695296 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:50.485045+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f939d000/0x0/0x4ffc00000, data 0x221066b/0x22cf000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112623616 unmapped: 16670720 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:51.485455+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112623616 unmapped: 16670720 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:52.485784+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f939d000/0x0/0x4ffc00000, data 0x221066b/0x22cf000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112656384 unmapped: 16637952 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:53.486015+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1342938 data_alloc: 234881024 data_used: 18280448
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112656384 unmapped: 16637952 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:54.486412+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 21.930198669s of 22.021116257s, submitted: 16
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112656384 unmapped: 16637952 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:55.486608+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f939b000/0x0/0x4ffc00000, data 0x221166b/0x22d0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112689152 unmapped: 16605184 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:56.486814+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112689152 unmapped: 16605184 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:57.486982+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112689152 unmapped: 16605184 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:58.487162+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1343466 data_alloc: 234881024 data_used: 18317312
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112803840 unmapped: 16490496 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:59.487283+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f939b000/0x0/0x4ffc00000, data 0x221166b/0x22d0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114663424 unmapped: 14630912 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:00.487427+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115384320 unmapped: 13910016 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:01.487576+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115392512 unmapped: 13901824 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:02.487779+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115458048 unmapped: 13836288 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:03.488009+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1362194 data_alloc: 234881024 data_used: 18333696
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9264000/0x0/0x4ffc00000, data 0x233366b/0x23f2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115458048 unmapped: 13836288 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:04.488194+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb261b7800 session 0x55fb25e2b860
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115458048 unmapped: 13836288 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9264000/0x0/0x4ffc00000, data 0x233366b/0x23f2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:05.488445+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1800.1 total, 600.0 interval
                                           Cumulative writes: 9251 writes, 35K keys, 9251 commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.01 MB/s
                                           Cumulative WAL: 9251 writes, 2253 syncs, 4.11 writes per sync, written: 0.03 GB, 0.01 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 1447 writes, 4631 keys, 1447 commit groups, 1.0 writes per commit group, ingest: 5.55 MB, 0.01 MB/s
                                           Interval WAL: 1447 writes, 614 syncs, 2.36 writes per sync, written: 0.01 GB, 0.01 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.082611084s of 11.189125061s, submitted: 41
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115490816 unmapped: 13803520 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:06.488751+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115490816 unmapped: 13803520 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:07.489007+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115490816 unmapped: 13803520 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:08.489255+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb24eda960
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1362210 data_alloc: 234881024 data_used: 18333696
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb25fd8000
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114360320 unmapped: 14934016 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:09.489505+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25fd8000 session 0x55fb268fbc20
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109035520 unmapped: 20258816 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:10.489815+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9c0e000/0x0/0x4ffc00000, data 0x19a065b/0x1a5e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109035520 unmapped: 20258816 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:11.490084+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109035520 unmapped: 20258816 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:12.490358+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109035520 unmapped: 20258816 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:13.490542+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1224948 data_alloc: 234881024 data_used: 9592832
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109035520 unmapped: 20258816 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:14.490744+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109035520 unmapped: 20258816 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:15.491007+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25c3c400 session 0x55fb268d2780
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9c0e000/0x0/0x4ffc00000, data 0x19a065b/0x1a5e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb25fd8000
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb235d5c00
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.954633713s of 10.025353432s, submitted: 24
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106209280 unmapped: 23085056 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb235d5c00 session 0x55fb240c14a0
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:16.491205+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106209280 unmapped: 23085056 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:17.491466+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106209280 unmapped: 23085056 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:18.491657+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1130496 data_alloc: 218103808 data_used: 4792320
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106209280 unmapped: 23085056 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:19.491871+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106209280 unmapped: 23085056 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:20.492114+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106209280 unmapped: 23085056 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:21.492264+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106250240 unmapped: 23044096 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:22.492437+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106250240 unmapped: 23044096 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:23.492624+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1130496 data_alloc: 218103808 data_used: 4792320
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106250240 unmapped: 23044096 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:24.492778+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106250240 unmapped: 23044096 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:25.492986+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106250240 unmapped: 23044096 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:26.493258+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106250240 unmapped: 23044096 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:27.493431+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106250240 unmapped: 23044096 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:28.493622+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1129757 data_alloc: 218103808 data_used: 4796416
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106250240 unmapped: 23044096 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:29.493835+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106250240 unmapped: 23044096 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:30.494033+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106250240 unmapped: 23044096 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:31.494249+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106250240 unmapped: 23044096 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:32.494399+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106250240 unmapped: 23044096 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:33.494535+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1129757 data_alloc: 218103808 data_used: 4796416
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106250240 unmapped: 23044096 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:34.494685+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106250240 unmapped: 23044096 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:35.494838+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106250240 unmapped: 23044096 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:36.495035+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106250240 unmapped: 23044096 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:37.495196+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105627648 unmapped: 23666688 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:38.495412+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1129757 data_alloc: 218103808 data_used: 4796416
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105627648 unmapped: 23666688 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:39.495585+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105627648 unmapped: 23666688 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:40.495808+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105627648 unmapped: 23666688 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:41.495968+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105627648 unmapped: 23666688 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:42.496236+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105627648 unmapped: 23666688 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:43.496384+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1129757 data_alloc: 218103808 data_used: 4796416
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105627648 unmapped: 23666688 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:44.496556+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2417cc00
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 29.085247040s of 29.158304214s, submitted: 25
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417cc00 session 0x55fb26f503c0
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105439232 unmapped: 23855104 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:45.496726+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9dda000/0x0/0x4ffc00000, data 0x17d564b/0x1892000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105439232 unmapped: 23855104 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:46.496900+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105439232 unmapped: 23855104 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:47.497034+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105439232 unmapped: 23855104 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:48.497235+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1178729 data_alloc: 218103808 data_used: 4796416
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105439232 unmapped: 23855104 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:49.497397+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb24bd1c00
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb24bd74a0
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb261b7800
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105439232 unmapped: 23855104 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:50.497648+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb263e9000
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106323968 unmapped: 22970368 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:51.497851+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9db6000/0x0/0x4ffc00000, data 0x17f964b/0x18b6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106946560 unmapped: 22347776 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:52.498024+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106946560 unmapped: 22347776 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:53.498194+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1222909 data_alloc: 234881024 data_used: 10895360
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106946560 unmapped: 22347776 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:54.498334+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9db6000/0x0/0x4ffc00000, data 0x17f964b/0x18b6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106946560 unmapped: 22347776 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:55.498481+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9db6000/0x0/0x4ffc00000, data 0x17f964b/0x18b6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106946560 unmapped: 22347776 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:56.498627+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106946560 unmapped: 22347776 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:57.498798+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106946560 unmapped: 22347776 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:58.499042+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1222909 data_alloc: 234881024 data_used: 10895360
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106954752 unmapped: 22339584 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:59.499319+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106954752 unmapped: 22339584 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:00.499583+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9db6000/0x0/0x4ffc00000, data 0x17f964b/0x18b6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106954752 unmapped: 22339584 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:01.499773+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106954752 unmapped: 22339584 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:02.499934+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9db6000/0x0/0x4ffc00000, data 0x17f964b/0x18b6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 17.328538895s of 17.374874115s, submitted: 11
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114114560 unmapped: 15179776 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:03.500125+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1297901 data_alloc: 234881024 data_used: 12738560
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:04.500327+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116293632 unmapped: 13000704 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9697000/0x0/0x4ffc00000, data 0x1f0a64b/0x1fc7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:05.500574+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116301824 unmapped: 12992512 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9697000/0x0/0x4ffc00000, data 0x1f0a64b/0x1fc7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:06.500755+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116301824 unmapped: 12992512 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9697000/0x0/0x4ffc00000, data 0x1f0a64b/0x1fc7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:07.500938+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116301824 unmapped: 12992512 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:08.501138+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116367360 unmapped: 12926976 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1298205 data_alloc: 234881024 data_used: 12746752
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:09.501404+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116367360 unmapped: 12926976 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:10.501851+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116367360 unmapped: 12926976 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:11.502099+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116367360 unmapped: 12926976 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:12.502250+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116375552 unmapped: 12918784 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9697000/0x0/0x4ffc00000, data 0x1f0a64b/0x1fc7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:13.502510+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116375552 unmapped: 12918784 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1298509 data_alloc: 234881024 data_used: 12754944
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:14.502680+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116375552 unmapped: 12918784 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9697000/0x0/0x4ffc00000, data 0x1f0a64b/0x1fc7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:15.502828+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116375552 unmapped: 12918784 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:16.503000+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116375552 unmapped: 12918784 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9697000/0x0/0x4ffc00000, data 0x1f0a64b/0x1fc7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9697000/0x0/0x4ffc00000, data 0x1f0a64b/0x1fc7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:17.503242+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116375552 unmapped: 12918784 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:18.503484+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116375552 unmapped: 12918784 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1298509 data_alloc: 234881024 data_used: 12754944
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:19.503858+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116375552 unmapped: 12918784 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:20.504209+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116391936 unmapped: 12902400 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:21.504638+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116391936 unmapped: 12902400 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:22.504822+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116391936 unmapped: 12902400 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9697000/0x0/0x4ffc00000, data 0x1f0a64b/0x1fc7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:23.505011+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116400128 unmapped: 12894208 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1299269 data_alloc: 234881024 data_used: 12775424
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:24.505220+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116400128 unmapped: 12894208 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:25.505499+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116400128 unmapped: 12894208 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:26.505729+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116408320 unmapped: 12886016 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9697000/0x0/0x4ffc00000, data 0x1f0a64b/0x1fc7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:27.505975+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116408320 unmapped: 12886016 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9697000/0x0/0x4ffc00000, data 0x1f0a64b/0x1fc7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:28.506287+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116408320 unmapped: 12886016 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1299269 data_alloc: 234881024 data_used: 12775424
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:29.506447+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116408320 unmapped: 12886016 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9697000/0x0/0x4ffc00000, data 0x1f0a64b/0x1fc7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:30.506666+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116408320 unmapped: 12886016 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2435d000
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2435d000 session 0x55fb24bd4b40
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb235d5c00
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb235d5c00 session 0x55fb24bd5680
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2417cc00
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417cc00 session 0x55fb271ac780
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2435d000
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2435d000 session 0x55fb271ac000
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb24bd1c00
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 28.524518967s of 28.681346893s, submitted: 84
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb26c18d20
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:31.506892+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb25c3c400
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25c3c400 session 0x55fb24bd5860
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb235d5c00
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb235d5c00 session 0x55fb26b31c20
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2417cc00
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114810880 unmapped: 14483456 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417cc00 session 0x55fb24bd43c0
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2435d000
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2435d000 session 0x55fb26ae72c0
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:32.507034+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9292000/0x0/0x4ffc00000, data 0x231c65b/0x23da000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114810880 unmapped: 14483456 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:33.507245+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114810880 unmapped: 14483456 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:34.507442+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1323404 data_alloc: 234881024 data_used: 12775424
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114810880 unmapped: 14483456 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:35.507632+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114319360 unmapped: 14974976 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:36.507800+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb24bd1c00
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114319360 unmapped: 14974976 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb26f594a0
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9291000/0x0/0x4ffc00000, data 0x231c67e/0x23db000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:37.507988+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb25e41400
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114319360 unmapped: 14974976 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9291000/0x0/0x4ffc00000, data 0x231c67e/0x23db000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:38.508167+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114360320 unmapped: 14934016 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:39.508522+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1352973 data_alloc: 234881024 data_used: 16846848
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116736000 unmapped: 12558336 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:40.508702+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116776960 unmapped: 12517376 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9291000/0x0/0x4ffc00000, data 0x231c67e/0x23db000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:41.508927+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116809728 unmapped: 12484608 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:42.509080+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116809728 unmapped: 12484608 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:43.509240+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116809728 unmapped: 12484608 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:44.509397+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1353581 data_alloc: 234881024 data_used: 16908288
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116809728 unmapped: 12484608 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:45.509549+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116809728 unmapped: 12484608 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:46.509714+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9291000/0x0/0x4ffc00000, data 0x231c67e/0x23db000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116842496 unmapped: 12451840 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:47.509874+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116842496 unmapped: 12451840 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:48.510074+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116842496 unmapped: 12451840 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 17.605432510s of 17.711872101s, submitted: 26
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:49.510318+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1406263 data_alloc: 234881024 data_used: 16941056
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 118104064 unmapped: 11190272 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:50.510526+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 118857728 unmapped: 10436608 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:51.510710+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 118857728 unmapped: 10436608 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f8b0f000/0x0/0x4ffc00000, data 0x2a9e67e/0x2b5d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:52.510916+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 118890496 unmapped: 10403840 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:53.511089+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 118890496 unmapped: 10403840 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:54.511309+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1416385 data_alloc: 234881024 data_used: 16928768
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f8b0f000/0x0/0x4ffc00000, data 0x2a9e67e/0x2b5d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 118890496 unmapped: 10403840 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:55.511480+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 118890496 unmapped: 10403840 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:56.511659+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 117432320 unmapped: 11862016 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:57.511858+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 117440512 unmapped: 11853824 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:58.512053+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 117440512 unmapped: 11853824 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:59.512212+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1413273 data_alloc: 234881024 data_used: 16928768
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 117440512 unmapped: 11853824 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f8aeb000/0x0/0x4ffc00000, data 0x2ac267e/0x2b81000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:00.512435+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 117440512 unmapped: 11853824 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.007748604s of 12.300899506s, submitted: 81
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:01.512648+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 117440512 unmapped: 11853824 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:02.512923+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25e41400 session 0x55fb25e2ba40
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 117440512 unmapped: 11853824 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb235d5c00
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb235d5c00 session 0x55fb26c2a000
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:03.513097+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f8aeb000/0x0/0x4ffc00000, data 0x2ac267e/0x2b81000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 113590272 unmapped: 15704064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:04.513279+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1298825 data_alloc: 234881024 data_used: 12820480
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 113590272 unmapped: 15704064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:05.513465+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 113590272 unmapped: 15704064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f96a4000/0x0/0x4ffc00000, data 0x1f0a64b/0x1fc7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:06.513632+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 113590272 unmapped: 15704064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:07.513812+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 113590272 unmapped: 15704064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:08.513965+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 113590272 unmapped: 15704064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:09.514126+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1298993 data_alloc: 234881024 data_used: 12820480
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 113590272 unmapped: 15704064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:10.514822+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 113590272 unmapped: 15704064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f96a4000/0x0/0x4ffc00000, data 0x1f0a64b/0x1fc7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:11.514961+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 113590272 unmapped: 15704064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:12.515108+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 113590272 unmapped: 15704064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:13.515319+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb263e9000 session 0x55fb26c192c0
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb261b7800 session 0x55fb24e9b680
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 113590272 unmapped: 15704064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2417cc00
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.708964348s of 12.890141487s, submitted: 37
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417cc00 session 0x55fb238f32c0
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:14.515507+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1145549 data_alloc: 218103808 data_used: 4796416
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110436352 unmapped: 18857984 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:15.515661+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110436352 unmapped: 18857984 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:16.515917+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110436352 unmapped: 18857984 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:17.516067+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110436352 unmapped: 18857984 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:18.516212+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110436352 unmapped: 18857984 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:19.516399+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1145549 data_alloc: 218103808 data_used: 4796416
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110436352 unmapped: 18857984 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:20.516632+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110436352 unmapped: 18857984 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:21.516799+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110436352 unmapped: 18857984 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:22.516955+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25e58c00 session 0x55fb24e88780
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb25e58c00
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25e59000 session 0x55fb24bb8d20
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb235d5c00
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110436352 unmapped: 18857984 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:23.517122+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110436352 unmapped: 18857984 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:24.517378+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25e59c00 session 0x55fb24e890e0
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2417cc00
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1145549 data_alloc: 218103808 data_used: 4796416
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110436352 unmapped: 18857984 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:25.517555+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110436352 unmapped: 18857984 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:26.517749+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25ef4800 session 0x55fb24edad20
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb261b7800
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110436352 unmapped: 18857984 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:27.517934+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110436352 unmapped: 18857984 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:28.518203+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110436352 unmapped: 18857984 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:29.518395+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1145549 data_alloc: 218103808 data_used: 4796416
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110436352 unmapped: 18857984 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:30.518579+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 16.401422501s of 16.425735474s, submitted: 9
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109821952 unmapped: 19472384 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:31.518792+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109920256 unmapped: 19374080 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:32.518924+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417d400 session 0x55fb25e2c000
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110018560 unmapped: 19275776 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:33.519111+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110018560 unmapped: 19275776 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:34.519261+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1145549 data_alloc: 218103808 data_used: 4796416
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110018560 unmapped: 19275776 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:35.519399+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110018560 unmapped: 19275776 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:36.519562+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110018560 unmapped: 19275776 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:37.519790+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110018560 unmapped: 19275776 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:38.519958+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110018560 unmapped: 19275776 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb25ef4800
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25ef4800 session 0x55fb26da4b40
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:39.520178+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1166053 data_alloc: 218103808 data_used: 4796416
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110075904 unmapped: 19218432 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:40.520386+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110075904 unmapped: 19218432 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:41.520598+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110075904 unmapped: 19218432 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:42.520770+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa220000/0x0/0x4ffc00000, data 0x138f64b/0x144c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110075904 unmapped: 19218432 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:43.520931+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb263e9000
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.438837051s of 13.030948639s, submitted: 230
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109223936 unmapped: 20070400 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2435d000
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2435d000 session 0x55fb26f503c0
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:44.521116+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb24bd1c00
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb26f50d20
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1165513 data_alloc: 218103808 data_used: 4796416
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109207552 unmapped: 20086784 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb26b42c00
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb26b42c00 session 0x55fb25107680
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb26b42c00
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb26b42c00 session 0x55fb25e2ba40
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:45.521279+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109207552 unmapped: 20086784 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2417d400
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:46.521477+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa21e000/0x0/0x4ffc00000, data 0x138f67e/0x144e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2435d000
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109305856 unmapped: 19988480 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa21e000/0x0/0x4ffc00000, data 0x138f67e/0x144e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:47.521680+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109305856 unmapped: 19988480 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:48.521829+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109305856 unmapped: 19988480 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:49.522010+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1182596 data_alloc: 218103808 data_used: 6209536
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109330432 unmapped: 19963904 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:50.522216+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109330432 unmapped: 19963904 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:51.522383+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109330432 unmapped: 19963904 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:52.522489+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa21e000/0x0/0x4ffc00000, data 0x138f67e/0x144e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2435d000 session 0x55fb24bb92c0
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417d400 session 0x55fb271ac780
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109330432 unmapped: 19963904 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb24bd1c00
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:53.522610+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb268fd2c0
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107749376 unmapped: 21544960 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:54.522743+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1153451 data_alloc: 218103808 data_used: 4792320
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107749376 unmapped: 21544960 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:55.522854+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107749376 unmapped: 21544960 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3be000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:56.522993+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107749376 unmapped: 21544960 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:57.523190+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21536768 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3be000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:58.523311+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21536768 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3be000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:59.523446+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1153603 data_alloc: 218103808 data_used: 4796416
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107765760 unmapped: 21528576 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:00.523655+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 16.980836868s of 17.077106476s, submitted: 31
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107765760 unmapped: 21528576 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:01.523828+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107765760 unmapped: 21528576 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:02.523993+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3be000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107765760 unmapped: 21528576 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:03.524181+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107765760 unmapped: 21528576 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:04.524419+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1153471 data_alloc: 218103808 data_used: 4796416
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107765760 unmapped: 21528576 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3be000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:05.524574+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb25ef4800
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 108363776 unmapped: 20930560 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25ef4800 session 0x55fb26db9860
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:06.524722+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 108363776 unmapped: 20930560 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:07.524861+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 108371968 unmapped: 20922368 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:08.525097+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 108371968 unmapped: 20922368 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb25ef4800
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25ef4800 session 0x55fb24bd7c20
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:09.525234+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1162335 data_alloc: 218103808 data_used: 4796416
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2417d400
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417d400 session 0x55fb26f501e0
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 108371968 unmapped: 20922368 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:10.525418+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa2ea000/0x0/0x4ffc00000, data 0x12c564b/0x1382000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2435d000
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2435d000 session 0x55fb26f50f00
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb24bd1c00
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.982028008s of 10.000374794s, submitted: 6
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb243d0960
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 108380160 unmapped: 20914176 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:11.525552+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb26b42c00
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 108380160 unmapped: 20914176 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:12.525752+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 108380160 unmapped: 20914176 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb26b42c00 session 0x55fb26db83c0
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2417d400
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:13.525892+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417d400 session 0x55fb26f510e0
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107429888 unmapped: 21864448 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:14.526177+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1155300 data_alloc: 218103808 data_used: 4796416
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107438080 unmapped: 21856256 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3be000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:15.526315+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107446272 unmapped: 21848064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:16.526453+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107446272 unmapped: 21848064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:17.526619+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107446272 unmapped: 21848064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:18.526858+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107446272 unmapped: 21848064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3be000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:19.527112+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1155300 data_alloc: 218103808 data_used: 4796416
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107446272 unmapped: 21848064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:20.527323+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107446272 unmapped: 21848064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:21.527475+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3be000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107446272 unmapped: 21848064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:22.527633+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107446272 unmapped: 21848064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:23.527969+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107446272 unmapped: 21848064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:24.528236+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3be000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1155300 data_alloc: 218103808 data_used: 4796416
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107446272 unmapped: 21848064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:25.528670+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107446272 unmapped: 21848064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:26.528832+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107446272 unmapped: 21848064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:27.529024+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107446272 unmapped: 21848064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:28.529274+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107446272 unmapped: 21848064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:29.529443+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3be000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1155300 data_alloc: 218103808 data_used: 4796416
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107454464 unmapped: 21839872 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:30.529656+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107462656 unmapped: 21831680 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:31.529857+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107462656 unmapped: 21831680 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:32.530020+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107462656 unmapped: 21831680 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:33.530211+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107462656 unmapped: 21831680 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:34.530377+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1155300 data_alloc: 218103808 data_used: 4796416
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3be000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107462656 unmapped: 21831680 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:35.530536+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107462656 unmapped: 21831680 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:36.530738+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107462656 unmapped: 21831680 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:37.530956+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107462656 unmapped: 21831680 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:38.531281+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3be000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107470848 unmapped: 21823488 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:39.531454+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2435d000
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 29.154647827s of 29.205055237s, submitted: 16
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2435d000 session 0x55fb26da4960
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1196112 data_alloc: 218103808 data_used: 4796416
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107470848 unmapped: 24977408 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:40.531691+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107470848 unmapped: 24977408 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:41.531854+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107479040 unmapped: 24969216 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:42.532042+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9dbf000/0x0/0x4ffc00000, data 0x17f064b/0x18ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107479040 unmapped: 24969216 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:43.532216+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107479040 unmapped: 24969216 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:44.532396+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1196112 data_alloc: 218103808 data_used: 4796416
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107479040 unmapped: 24969216 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:45.532585+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9dbf000/0x0/0x4ffc00000, data 0x17f064b/0x18ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb24bd1c00
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb26c2b2c0
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107642880 unmapped: 24805376 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:46.532780+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb25ef4800
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107642880 unmapped: 24805376 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:47.532924+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2892dc00
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9d9a000/0x0/0x4ffc00000, data 0x181466e/0x18d2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109158400 unmapped: 23289856 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:48.533111+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109158400 unmapped: 23289856 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:49.533261+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1243041 data_alloc: 234881024 data_used: 11091968
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9d9a000/0x0/0x4ffc00000, data 0x181466e/0x18d2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109158400 unmapped: 23289856 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:50.533451+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109158400 unmapped: 23289856 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:51.533630+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109158400 unmapped: 23289856 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:52.533767+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109158400 unmapped: 23289856 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:53.533919+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25ef5800 session 0x55fb2719dc20
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109158400 unmapped: 23289856 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:54.534028+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1243041 data_alloc: 234881024 data_used: 11091968
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109166592 unmapped: 23281664 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:55.534241+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9d9a000/0x0/0x4ffc00000, data 0x181466e/0x18d2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109166592 unmapped: 23281664 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:56.534367+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9d9a000/0x0/0x4ffc00000, data 0x181466e/0x18d2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109166592 unmapped: 23281664 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:57.534491+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: mgrc ms_handle_reset ms_handle_reset con 0x55fb26150000
Dec 06 10:17:30 compute-1 ceph-osd[77465]: mgrc reconnect Terminating session with v2:192.168.122.100:6800/3885409716
Dec 06 10:17:30 compute-1 ceph-osd[77465]: mgrc reconnect Starting new session with [v2:192.168.122.100:6800/3885409716,v1:192.168.122.100:6801/3885409716]
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: get_auth_request con 0x55fb26b42c00 auth_method 0
Dec 06 10:17:30 compute-1 ceph-osd[77465]: mgrc handle_mgr_configure stats_period=5
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109133824 unmapped: 23314432 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 18.384727478s of 18.432491302s, submitted: 9
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:58.534587+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9a06000/0x0/0x4ffc00000, data 0x1ba866e/0x1c66000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb263e9000 session 0x55fb268d21e0
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112214016 unmapped: 20234240 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:59.534747+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1283175 data_alloc: 234881024 data_used: 11640832
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112279552 unmapped: 20168704 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:00.534919+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112279552 unmapped: 20168704 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:01.535119+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112279552 unmapped: 20168704 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:02.535320+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112271360 unmapped: 20176896 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:03.535505+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f99f8000/0x0/0x4ffc00000, data 0x1bb666e/0x1c74000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112271360 unmapped: 20176896 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:04.535697+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f99f8000/0x0/0x4ffc00000, data 0x1bb666e/0x1c74000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2417d400
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1282131 data_alloc: 234881024 data_used: 11640832
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112271360 unmapped: 20176896 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:05.535824+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112271360 unmapped: 20176896 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:06.535955+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112271360 unmapped: 20176896 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:07.536087+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f99f8000/0x0/0x4ffc00000, data 0x1bb666e/0x1c74000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112263168 unmapped: 20185088 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:08.536261+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112263168 unmapped: 20185088 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:09.536418+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2435d000
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.296810150s of 11.368459702s, submitted: 31
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1282111 data_alloc: 234881024 data_used: 11636736
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112263168 unmapped: 20185088 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:10.536635+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112263168 unmapped: 20185088 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:11.536784+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f99f8000/0x0/0x4ffc00000, data 0x1bb666e/0x1c74000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2892dc00 session 0x55fb271ad680
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25ef4800 session 0x55fb268fe000
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112263168 unmapped: 20185088 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb24bd1c00
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:12.536936+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb25e2a1e0
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109723648 unmapped: 22724608 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:13.537077+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109723648 unmapped: 22724608 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:14.537272+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9fae000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163642 data_alloc: 218103808 data_used: 4792320
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109723648 unmapped: 22724608 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:15.537439+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109723648 unmapped: 22724608 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:16.537594+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109772800 unmapped: 22675456 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:17.537736+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109764608 unmapped: 22683648 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:18.537982+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109764608 unmapped: 22683648 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:19.538172+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.109940529s of 10.186728477s, submitted: 30
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163202 data_alloc: 218103808 data_used: 4796416
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109764608 unmapped: 22683648 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:20.538476+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109764608 unmapped: 22683648 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:21.538623+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109764608 unmapped: 22683648 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:22.538779+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109764608 unmapped: 22683648 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:23.539002+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109764608 unmapped: 22683648 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:24.539222+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163070 data_alloc: 218103808 data_used: 4796416
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109764608 unmapped: 22683648 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:25.539445+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109764608 unmapped: 22683648 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:26.539711+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109764608 unmapped: 22683648 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:27.539866+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109764608 unmapped: 22683648 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:28.540061+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109764608 unmapped: 22683648 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:29.540227+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163070 data_alloc: 218103808 data_used: 4796416
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109764608 unmapped: 22683648 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:30.540458+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109764608 unmapped: 22683648 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:31.540634+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109764608 unmapped: 22683648 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:32.540899+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109764608 unmapped: 22683648 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:33.541253+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109764608 unmapped: 22683648 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:34.541414+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163070 data_alloc: 218103808 data_used: 4796416
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109772800 unmapped: 22675456 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb25ef5800
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 15.609736443s of 15.626793861s, submitted: 5
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:35.541604+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25ef5800 session 0x55fb268d32c0
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110092288 unmapped: 26034176 heap: 136126464 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:36.541765+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110092288 unmapped: 26034176 heap: 136126464 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:37.541957+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110092288 unmapped: 26034176 heap: 136126464 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:38.542130+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f99b7000/0x0/0x4ffc00000, data 0x17e864b/0x18a5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110092288 unmapped: 26034176 heap: 136126464 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:39.542319+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb25ef5800
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1209438 data_alloc: 218103808 data_used: 4796416
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25ef5800 session 0x55fb26649e00
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110403584 unmapped: 25722880 heap: 136126464 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:40.542527+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb24bd1c00
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110403584 unmapped: 25722880 heap: 136126464 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:41.542647+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb25ef4800
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417d400 session 0x55fb268f6960
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110706688 unmapped: 25419776 heap: 136126464 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:42.542762+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9993000/0x0/0x4ffc00000, data 0x180c64b/0x18c9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112590848 unmapped: 23535616 heap: 136126464 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:43.542897+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:44.543084+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112590848 unmapped: 23535616 heap: 136126464 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9993000/0x0/0x4ffc00000, data 0x180c64b/0x18c9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1251390 data_alloc: 234881024 data_used: 10969088
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:45.543270+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112590848 unmapped: 23535616 heap: 136126464 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:46.543437+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112590848 unmapped: 23535616 heap: 136126464 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:47.543608+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112590848 unmapped: 23535616 heap: 136126464 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:48.543778+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112590848 unmapped: 23535616 heap: 136126464 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:49.543930+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112590848 unmapped: 23535616 heap: 136126464 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9993000/0x0/0x4ffc00000, data 0x180c64b/0x18c9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9993000/0x0/0x4ffc00000, data 0x180c64b/0x18c9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1251390 data_alloc: 234881024 data_used: 10969088
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:50.544225+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112590848 unmapped: 23535616 heap: 136126464 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:51.544384+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112590848 unmapped: 23535616 heap: 136126464 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9993000/0x0/0x4ffc00000, data 0x180c64b/0x18c9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb263e9000
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 16.847808838s of 16.891704559s, submitted: 6
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb263e9000 session 0x55fb240c2f00
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:52.544535+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112812032 unmapped: 26992640 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2892dc00
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:53.544698+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115073024 unmapped: 24731648 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:54.544863+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115073024 unmapped: 24731648 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1386308 data_alloc: 234881024 data_used: 11198464
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:55.545043+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115712000 unmapped: 24092672 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f8877000/0x0/0x4ffc00000, data 0x292864b/0x29e5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2892d800
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2892d800 session 0x55fb240c10e0
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:56.545209+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115712000 unmapped: 24092672 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2892d400
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2892d400 session 0x55fb271acd20
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2417d400
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417d400 session 0x55fb26a99c20
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb25ef5800
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25ef5800 session 0x55fb268fe1e0
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:57.545386+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115712000 unmapped: 24092672 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb263e9000
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:58.545586+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115712000 unmapped: 24092672 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:59.545748+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119521280 unmapped: 20283392 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1471204 data_alloc: 234881024 data_used: 23625728
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:00.545969+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 125157376 unmapped: 14647296 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f8874000/0x0/0x4ffc00000, data 0x292b64b/0x29e8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:01.546127+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 125157376 unmapped: 14647296 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:02.546310+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 125157376 unmapped: 14647296 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:03.546723+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 125157376 unmapped: 14647296 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:04.546877+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 125157376 unmapped: 14647296 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f8874000/0x0/0x4ffc00000, data 0x292b64b/0x29e8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1472076 data_alloc: 234881024 data_used: 23629824
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:05.547071+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 125157376 unmapped: 14647296 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f8874000/0x0/0x4ffc00000, data 0x292b64b/0x29e8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:06.547208+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 125157376 unmapped: 14647296 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:07.547352+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 125157376 unmapped: 14647296 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:08.547532+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 125157376 unmapped: 14647296 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:09.547668+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 125214720 unmapped: 14589952 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 17.025381088s of 17.272668839s, submitted: 73
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f84cf000/0x0/0x4ffc00000, data 0x2ca464b/0x2d61000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1526224 data_alloc: 234881024 data_used: 23797760
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:10.547862+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 129245184 unmapped: 10559488 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:11.548010+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 129335296 unmapped: 10469376 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:12.548193+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 129417216 unmapped: 10387456 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:13.548360+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 129417216 unmapped: 10387456 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f814b000/0x0/0x4ffc00000, data 0x305364b/0x3110000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:14.548525+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 10371072 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1530806 data_alloc: 234881024 data_used: 23859200
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:15.548712+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 10371072 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:16.548872+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 10371072 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:17.549013+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 129458176 unmapped: 10346496 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f8149000/0x0/0x4ffc00000, data 0x305664b/0x3113000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:18.549240+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 129490944 unmapped: 10313728 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:19.549409+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 129490944 unmapped: 10313728 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1528822 data_alloc: 234881024 data_used: 23859200
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:20.549616+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 129490944 unmapped: 10313728 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:21.549769+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f8149000/0x0/0x4ffc00000, data 0x305664b/0x3113000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 129490944 unmapped: 10313728 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f8149000/0x0/0x4ffc00000, data 0x305664b/0x3113000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:22.550139+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 129490944 unmapped: 10313728 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f8149000/0x0/0x4ffc00000, data 0x305664b/0x3113000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:23.550307+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 129490944 unmapped: 10313728 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:24.550456+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 129490944 unmapped: 10313728 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1528822 data_alloc: 234881024 data_used: 23859200
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:25.550600+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 15.850658417s of 15.982189178s, submitted: 58
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 129490944 unmapped: 10313728 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb263e9000 session 0x55fb26ae6b40
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2892d400
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2892d400 session 0x55fb25e2cb40
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:26.550762+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119799808 unmapped: 20004864 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:27.550941+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119799808 unmapped: 20004864 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:28.551103+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f91ec000/0x0/0x4ffc00000, data 0x1c3d64b/0x1cfa000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119799808 unmapped: 20004864 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:29.551297+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f91ec000/0x0/0x4ffc00000, data 0x1c3d64b/0x1cfa000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119799808 unmapped: 20004864 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1305206 data_alloc: 234881024 data_used: 11198464
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:30.551503+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119799808 unmapped: 20004864 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:31.551671+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119799808 unmapped: 20004864 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:32.551838+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f91ec000/0x0/0x4ffc00000, data 0x1c3d64b/0x1cfa000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119799808 unmapped: 20004864 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25ef4800 session 0x55fb25107c20
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb268fad20
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2417d400
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:33.551980+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115548160 unmapped: 24256512 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417d400 session 0x55fb24e894a0
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:34.552179+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115556352 unmapped: 24248320 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:35.552351+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1182518 data_alloc: 218103808 data_used: 4796416
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:36.552501+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:37.552674+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:38.552833+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:39.553012+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:40.553278+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1182518 data_alloc: 218103808 data_used: 4796416
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:41.553421+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:42.553602+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:43.553766+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:44.554330+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:45.554552+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1182518 data_alloc: 218103808 data_used: 4796416
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:46.554744+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:47.554952+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:48.555191+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:49.555495+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:50.555737+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1182518 data_alloc: 218103808 data_used: 4796416
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:51.555920+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:52.556199+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:53.556356+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:54.556474+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:55.556691+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1182518 data_alloc: 218103808 data_used: 4796416
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:56.556921+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:57.557077+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:58.557227+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb25ef5800
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25ef5800 session 0x55fb24bd50e0
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb263e9000
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb263e9000 session 0x55fb24f02000
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2892d400
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2892d400 session 0x55fb26ae7a40
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2892d400
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:59.557379+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2892d400 session 0x55fb24bb85a0
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2417d400
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 33.909008026s of 33.959445953s, submitted: 21
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417d400 session 0x55fb2422b680
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb24bd1c00
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb26ae6000
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb25ef5800
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25ef5800 session 0x55fb24bd5680
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb263e9000
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115572736 unmapped: 24231936 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb263e9000 session 0x55fb24f03c20
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb263e9000
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb263e9000 session 0x55fb24bd4960
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:00.557610+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9b13000/0x0/0x4ffc00000, data 0x168c64b/0x1749000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1222243 data_alloc: 218103808 data_used: 4796416
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115580928 unmapped: 24223744 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:01.557764+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115580928 unmapped: 24223744 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:02.557866+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115580928 unmapped: 24223744 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9b13000/0x0/0x4ffc00000, data 0x168c64b/0x1749000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:03.558010+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2417d400
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115580928 unmapped: 24223744 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417d400 session 0x55fb26a972c0
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb24bd1c00
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb240c32c0
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:04.558182+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115580928 unmapped: 24223744 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb25ef5800
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25ef5800 session 0x55fb240c21e0
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2892d400
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2892d400 session 0x55fb24e89c20
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:05.558342+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1224048 data_alloc: 218103808 data_used: 4796416
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115572736 unmapped: 24231936 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:06.558515+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2892d400
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115580928 unmapped: 24223744 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:07.558649+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115761152 unmapped: 24043520 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:08.558802+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115761152 unmapped: 24043520 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9b12000/0x0/0x4ffc00000, data 0x168c66e/0x174a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:09.558925+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115761152 unmapped: 24043520 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:10.559092+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1249716 data_alloc: 218103808 data_used: 8503296
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115761152 unmapped: 24043520 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:11.559290+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115761152 unmapped: 24043520 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9b12000/0x0/0x4ffc00000, data 0x168c66e/0x174a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:12.559502+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115761152 unmapped: 24043520 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:13.559656+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115761152 unmapped: 24043520 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:14.559812+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115769344 unmapped: 24035328 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:15.559995+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1249716 data_alloc: 218103808 data_used: 8503296
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9b12000/0x0/0x4ffc00000, data 0x168c66e/0x174a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115769344 unmapped: 24035328 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:16.560110+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115769344 unmapped: 24035328 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:17.560273+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115769344 unmapped: 24035328 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 18.381628036s of 18.489994049s, submitted: 27
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:18.560434+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119668736 unmapped: 20135936 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:19.560568+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119840768 unmapped: 19963904 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:20.560781+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1326434 data_alloc: 218103808 data_used: 8761344
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119840768 unmapped: 19963904 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:21.560969+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9225000/0x0/0x4ffc00000, data 0x1f6a66e/0x2028000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119840768 unmapped: 19963904 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:22.561096+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119840768 unmapped: 19963904 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:23.561262+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9225000/0x0/0x4ffc00000, data 0x1f6a66e/0x2028000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119840768 unmapped: 19963904 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:24.561456+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119840768 unmapped: 19963904 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:25.561623+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1326434 data_alloc: 218103808 data_used: 8761344
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119840768 unmapped: 19963904 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:26.561775+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119840768 unmapped: 19963904 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:27.561939+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119840768 unmapped: 19963904 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:28.562116+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9225000/0x0/0x4ffc00000, data 0x1f6a66e/0x2028000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119840768 unmapped: 19963904 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:29.562250+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119840768 unmapped: 19963904 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:30.562455+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1326434 data_alloc: 218103808 data_used: 8761344
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119840768 unmapped: 19963904 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:31.562617+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119840768 unmapped: 19963904 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:32.562765+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119840768 unmapped: 19963904 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:33.562900+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 15.395916939s of 15.577631950s, submitted: 77
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2892d400 session 0x55fb26c2a5a0
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9234000/0x0/0x4ffc00000, data 0x1f6a66e/0x2028000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2892c800
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 118964224 unmapped: 20840448 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2892c800 session 0x55fb25e2cb40
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:34.563028+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:35.563194+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1193304 data_alloc: 218103808 data_used: 4796416
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:36.563309+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9fae000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:37.563427+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:38.563548+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9fae000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:39.563691+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:40.563863+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1193304 data_alloc: 218103808 data_used: 4796416
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9fae000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:41.563993+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:42.564139+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2435d000 session 0x55fb26c192c0
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:43.564317+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9fae000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:44.564455+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:45.564618+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1193304 data_alloc: 218103808 data_used: 4796416
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:46.564823+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:47.565037+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9fae000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:48.565235+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:49.565361+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:50.565547+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1193304 data_alloc: 218103808 data_used: 4796416
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:51.565674+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:52.565816+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9fae000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:53.565988+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:54.566267+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9fae000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:55.566766+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1193304 data_alloc: 218103808 data_used: 4796416
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:56.566963+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:57.567125+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9fae000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:58.567348+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:59.567540+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:00.567730+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1193304 data_alloc: 218103808 data_used: 4796416
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:01.567903+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9fae000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:02.568067+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:03.568222+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:04.568664+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9fae000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:05.568968+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1193304 data_alloc: 218103808 data_used: 4796416
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:06.569263+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25fd8000 session 0x55fb26f38000
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:07.569622+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:08.569953+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:09.570128+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 35.909954071s of 36.011264801s, submitted: 36
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9fae000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115818496 unmapped: 23986176 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:10.570366+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1193172 data_alloc: 218103808 data_used: 4796416
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115818496 unmapped: 23986176 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:11.570532+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115818496 unmapped: 23986176 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:12.570807+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115818496 unmapped: 23986176 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9fae000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:13.570999+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115818496 unmapped: 23986176 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:14.571214+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115818496 unmapped: 23986176 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:15.571413+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1193172 data_alloc: 218103808 data_used: 4796416
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115818496 unmapped: 23986176 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:16.571587+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115818496 unmapped: 23986176 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:17.571762+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9fae000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115818496 unmapped: 23986176 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:18.571914+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115818496 unmapped: 23986176 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:19.572068+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9fae000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115826688 unmapped: 23977984 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:20.572304+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1193172 data_alloc: 218103808 data_used: 4796416
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115826688 unmapped: 23977984 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:21.572462+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115826688 unmapped: 23977984 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:22.572651+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115826688 unmapped: 23977984 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:23.572883+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115826688 unmapped: 23977984 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:24.573047+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9fae000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115826688 unmapped: 23977984 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:25.573221+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1193172 data_alloc: 218103808 data_used: 4796416
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115826688 unmapped: 23977984 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:26.573384+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115826688 unmapped: 23977984 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:27.573531+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115826688 unmapped: 23977984 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:28.573733+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9fae000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115834880 unmapped: 23969792 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:29.573909+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9fae000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2892dc00 session 0x55fb26da45a0
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115834880 unmapped: 23969792 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:30.574101+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 21.256072998s of 21.260541916s, submitted: 1
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115834880 unmapped: 23969792 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:31.574263+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115834880 unmapped: 23969792 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:32.574425+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115834880 unmapped: 23969792 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:33.574545+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115834880 unmapped: 23969792 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:34.574697+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115834880 unmapped: 23969792 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:35.574861+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115834880 unmapped: 23969792 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:36.575010+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115834880 unmapped: 23969792 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:37.575245+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115834880 unmapped: 23969792 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:38.575539+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115834880 unmapped: 23969792 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:39.575696+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115834880 unmapped: 23969792 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:40.575886+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2435d000
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.078483582s of 10.083856583s, submitted: 1
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192880 data_alloc: 218103808 data_used: 4796416
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115834880 unmapped: 23969792 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:41.576011+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115834880 unmapped: 23969792 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:42.576201+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115834880 unmapped: 23969792 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:43.576358+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115834880 unmapped: 23969792 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:44.576491+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 23961600 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:45.576643+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192896 data_alloc: 218103808 data_used: 4792320
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 23961600 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:46.576788+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 23961600 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:47.576934+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 23961600 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:48.577081+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 23961600 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:49.577235+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 23961600 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:50.577432+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192896 data_alloc: 218103808 data_used: 4792320
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 23961600 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:51.577632+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115851264 unmapped: 23953408 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:52.577775+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115851264 unmapped: 23953408 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:53.577912+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115851264 unmapped: 23953408 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:54.578055+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115851264 unmapped: 23953408 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:55.578196+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:17:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192896 data_alloc: 218103808 data_used: 4792320
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115851264 unmapped: 23953408 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:56.578322+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115810304 unmapped: 23994368 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:57.578439+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 16.716539383s of 16.728366852s, submitted: 3
Dec 06 10:17:30 compute-1 ceph-osd[77465]: do_command 'config diff' '{prefix=config diff}'
Dec 06 10:17:30 compute-1 ceph-osd[77465]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Dec 06 10:17:30 compute-1 ceph-osd[77465]: do_command 'config show' '{prefix=config show}'
Dec 06 10:17:30 compute-1 ceph-osd[77465]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Dec 06 10:17:30 compute-1 ceph-osd[77465]: do_command 'counter dump' '{prefix=counter dump}'
Dec 06 10:17:30 compute-1 ceph-osd[77465]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Dec 06 10:17:30 compute-1 ceph-osd[77465]: do_command 'counter schema' '{prefix=counter schema}'
Dec 06 10:17:30 compute-1 ceph-osd[77465]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115564544 unmapped: 24240128 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:58.578602+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115302400 unmapped: 24502272 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:59.578804+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115712000 unmapped: 24092672 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:17:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:00.578973+0000)
Dec 06 10:17:30 compute-1 ceph-osd[77465]: do_command 'log dump' '{prefix=log dump}'
Dec 06 10:17:31 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0)
Dec 06 10:17:31 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4105181392' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Dec 06 10:17:31 compute-1 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 06 10:17:31 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:17:31 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:17:31 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:31.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:17:31 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:17:31 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:17:31 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:31.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:17:31 compute-1 ceph-mon[79770]: from='client.17340 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:17:31 compute-1 ceph-mon[79770]: from='client.25712 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:17:31 compute-1 ceph-mon[79770]: from='client.17346 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:17:31 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/96602646' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Dec 06 10:17:31 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/2197448062' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Dec 06 10:17:31 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/1967292708' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Dec 06 10:17:31 compute-1 ceph-mon[79770]: from='client.26671 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:17:31 compute-1 ceph-mon[79770]: from='client.17364 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:17:31 compute-1 ceph-mon[79770]: from='client.25727 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:17:31 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/1474324905' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Dec 06 10:17:31 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/4105181392' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Dec 06 10:17:31 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/2642598236' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Dec 06 10:17:31 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0)
Dec 06 10:17:31 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/519140837' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Dec 06 10:17:31 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon stat"} v 0)
Dec 06 10:17:31 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2058821366' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Dec 06 10:17:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:31 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:17:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:31 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:17:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:31 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:17:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:32 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:17:32 compute-1 crontab[240366]: (root) LIST (root)
Dec 06 10:17:32 compute-1 ceph-mon[79770]: from='client.26698 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:17:32 compute-1 ceph-mon[79770]: from='client.17385 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:17:32 compute-1 ceph-mon[79770]: from='client.25742 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:17:32 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/154680080' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Dec 06 10:17:32 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/519140837' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Dec 06 10:17:32 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/1553814133' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Dec 06 10:17:32 compute-1 ceph-mon[79770]: from='client.26722 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:17:32 compute-1 ceph-mon[79770]: from='client.17412 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:17:32 compute-1 ceph-mon[79770]: from='client.25751 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:17:32 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/1911995297' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Dec 06 10:17:32 compute-1 ceph-mon[79770]: pgmap v1094: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:17:32 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/2058821366' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Dec 06 10:17:32 compute-1 ceph-mon[79770]: from='client.26737 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:17:32 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/4055719220' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Dec 06 10:17:32 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "node ls"} v 0)
Dec 06 10:17:32 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1815850061' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Dec 06 10:17:33 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:17:33 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:17:33 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:33.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:17:33 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:17:33 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:17:33 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:33.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:17:33 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush class ls"} v 0)
Dec 06 10:17:33 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3149206455' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Dec 06 10:17:33 compute-1 ceph-mon[79770]: from='client.17439 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:17:33 compute-1 ceph-mon[79770]: from='client.25763 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:17:33 compute-1 ceph-mon[79770]: from='client.26752 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:17:33 compute-1 ceph-mon[79770]: from='client.17454 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:17:33 compute-1 ceph-mon[79770]: from='client.25775 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:17:33 compute-1 ceph-mon[79770]: from='client.26767 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:17:33 compute-1 ceph-mon[79770]: from='client.17475 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:17:33 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/1749799299' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Dec 06 10:17:33 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/547471679' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Dec 06 10:17:33 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/1815850061' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Dec 06 10:17:33 compute-1 ceph-mon[79770]: from='client.25787 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:17:33 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/4209183984' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Dec 06 10:17:33 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/4191974410' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Dec 06 10:17:33 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/2079500196' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Dec 06 10:17:33 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/3149206455' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Dec 06 10:17:33 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush dump"} v 0)
Dec 06 10:17:33 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2924573047' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Dec 06 10:17:33 compute-1 podman[240544]: 2025-12-06 10:17:33.762939613 +0000 UTC m=+0.062277041 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, container_name=multipathd)
Dec 06 10:17:33 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} v 0)
Dec 06 10:17:33 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2910934850' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Dec 06 10:17:33 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:17:34 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush rule ls"} v 0)
Dec 06 10:17:34 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1996211133' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Dec 06 10:17:34 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr dump", "format": "json-pretty"} v 0)
Dec 06 10:17:34 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/745328308' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Dec 06 10:17:34 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush show-tunables"} v 0)
Dec 06 10:17:34 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3333722021' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Dec 06 10:17:34 compute-1 ceph-mon[79770]: from='client.17487 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:17:34 compute-1 ceph-mon[79770]: from='client.25808 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:17:34 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/3062865665' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Dec 06 10:17:34 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/1400424366' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Dec 06 10:17:34 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/3270305602' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Dec 06 10:17:34 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/2924573047' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Dec 06 10:17:34 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/416495288' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Dec 06 10:17:34 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/2910934850' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Dec 06 10:17:34 compute-1 ceph-mon[79770]: pgmap v1095: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:17:34 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/117394827' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Dec 06 10:17:34 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/916825396' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Dec 06 10:17:34 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/3574515276' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Dec 06 10:17:34 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/1996211133' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Dec 06 10:17:34 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/619943755' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Dec 06 10:17:34 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/745328308' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Dec 06 10:17:34 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/4152843420' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Dec 06 10:17:34 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/3824839772' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Dec 06 10:17:34 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/2022830260' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Dec 06 10:17:34 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/3333722021' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Dec 06 10:17:34 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr metadata", "format": "json-pretty"} v 0)
Dec 06 10:17:34 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3551154037' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Dec 06 10:17:34 compute-1 sudo[240718]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:17:34 compute-1 sudo[240718]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:17:34 compute-1 sudo[240718]: pam_unix(sudo:session): session closed for user root
Dec 06 10:17:34 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush tree", "show_shadow": true} v 0)
Dec 06 10:17:34 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3789003523' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Dec 06 10:17:35 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:17:35 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:17:35 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:17:35 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:17:35 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:35.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:17:35 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:35.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:17:35 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr module ls", "format": "json-pretty"} v 0)
Dec 06 10:17:35 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2063758053' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Dec 06 10:17:35 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd erasure-code-profile ls"} v 0)
Dec 06 10:17:35 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1968791473' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Dec 06 10:17:35 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/2982586928' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Dec 06 10:17:35 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/3551154037' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Dec 06 10:17:35 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/4038371396' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Dec 06 10:17:35 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/2882127279' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Dec 06 10:17:35 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/508111638' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Dec 06 10:17:35 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/3789003523' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Dec 06 10:17:35 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/1564210752' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Dec 06 10:17:35 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/2990078285' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Dec 06 10:17:35 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/4265932852' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Dec 06 10:17:35 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/487764021' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Dec 06 10:17:35 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/2063758053' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Dec 06 10:17:35 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/1968791473' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Dec 06 10:17:35 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/963798927' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Dec 06 10:17:35 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/3486939175' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Dec 06 10:17:35 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/4113509356' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Dec 06 10:17:35 compute-1 systemd[1]: Starting Hostname Service...
Dec 06 10:17:35 compute-1 systemd[1]: Started Hostname Service.
Dec 06 10:17:35 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr services", "format": "json-pretty"} v 0)
Dec 06 10:17:35 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1457285215' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Dec 06 10:17:35 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd metadata"} v 0)
Dec 06 10:17:35 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/894360360' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Dec 06 10:17:36 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr stat", "format": "json-pretty"} v 0)
Dec 06 10:17:36 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1794527057' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Dec 06 10:17:36 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd utilization"} v 0)
Dec 06 10:17:36 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/987939842' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Dec 06 10:17:36 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/394227616' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Dec 06 10:17:36 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/1457285215' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Dec 06 10:17:36 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/894360360' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Dec 06 10:17:36 compute-1 ceph-mon[79770]: pgmap v1096: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:17:36 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/1474186761' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Dec 06 10:17:36 compute-1 ceph-mon[79770]: from='client.26941 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:17:36 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/2789134809' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Dec 06 10:17:36 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/1794527057' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Dec 06 10:17:36 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/987939842' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Dec 06 10:17:36 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/1260105720' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Dec 06 10:17:36 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr versions", "format": "json-pretty"} v 0)
Dec 06 10:17:36 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/558961422' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Dec 06 10:17:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:36 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:17:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:36 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:17:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:36 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:17:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:37 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:17:37 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:17:37 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:17:37 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:17:37 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:17:37 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:37.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:17:37 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:37.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:17:37 compute-1 ceph-mon[79770]: from='client.26947 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:17:37 compute-1 ceph-mon[79770]: from='client.17619 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:17:37 compute-1 ceph-mon[79770]: from='client.26959 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:17:37 compute-1 ceph-mon[79770]: from='client.26971 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:17:37 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/558961422' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Dec 06 10:17:37 compute-1 ceph-mon[79770]: from='client.25940 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:17:37 compute-1 ceph-mon[79770]: from='client.26989 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:17:37 compute-1 ceph-mon[79770]: from='client.17634 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:17:37 compute-1 ceph-mon[79770]: from='client.26986 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:17:37 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/2804408455' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Dec 06 10:17:38 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "quorum_status"} v 0)
Dec 06 10:17:38 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2629627461' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Dec 06 10:17:38 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "versions"} v 0)
Dec 06 10:17:38 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2453809493' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Dec 06 10:17:38 compute-1 ceph-mon[79770]: from='client.25955 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:17:38 compute-1 ceph-mon[79770]: from='client.25961 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:17:38 compute-1 ceph-mon[79770]: from='client.27001 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:17:38 compute-1 ceph-mon[79770]: from='client.17643 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:17:38 compute-1 ceph-mon[79770]: from='client.17655 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:17:38 compute-1 ceph-mon[79770]: from='client.25979 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:17:38 compute-1 ceph-mon[79770]: from='client.27025 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:17:38 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/2453137415' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Dec 06 10:17:38 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/2917337549' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Dec 06 10:17:38 compute-1 ceph-mon[79770]: pgmap v1097: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:17:38 compute-1 ceph-mon[79770]: from='client.17670 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:17:38 compute-1 ceph-mon[79770]: from='client.27043 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:17:38 compute-1 ceph-mon[79770]: from='client.25994 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:17:38 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/2629627461' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Dec 06 10:17:38 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/2392546310' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Dec 06 10:17:38 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/979905628' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Dec 06 10:17:38 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/3607230830' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Dec 06 10:17:38 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:17:39 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail", "format": "json-pretty"} v 0)
Dec 06 10:17:39 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1616168784' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Dec 06 10:17:39 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:17:39 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:17:39 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:17:39 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:39.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:17:39 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:17:39 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:39.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:17:39 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 06 10:17:39 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 06 10:17:39 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd tree", "format": "json-pretty"} v 0)
Dec 06 10:17:39 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2688295501' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Dec 06 10:17:39 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Dec 06 10:17:39 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Dec 06 10:17:39 compute-1 ceph-mon[79770]: from='client.17694 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:17:39 compute-1 ceph-mon[79770]: from='client.17700 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:17:39 compute-1 ceph-mon[79770]: from='client.26015 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:17:39 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/2453809493' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Dec 06 10:17:39 compute-1 ceph-mon[79770]: from='client.17721 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:17:39 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/4083141842' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Dec 06 10:17:39 compute-1 ceph-mon[79770]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 06 10:17:39 compute-1 ceph-mon[79770]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 06 10:17:39 compute-1 ceph-mon[79770]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Dec 06 10:17:39 compute-1 ceph-mon[79770]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Dec 06 10:17:39 compute-1 ceph-mon[79770]: from='client.26027 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:17:39 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:17:39 compute-1 ceph-mon[79770]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 06 10:17:39 compute-1 ceph-mon[79770]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 06 10:17:39 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/1616168784' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Dec 06 10:17:39 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/966773222' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Dec 06 10:17:39 compute-1 ceph-mon[79770]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Dec 06 10:17:39 compute-1 ceph-mon[79770]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Dec 06 10:17:39 compute-1 ceph-mon[79770]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 06 10:17:39 compute-1 ceph-mon[79770]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 06 10:17:39 compute-1 ceph-mon[79770]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 06 10:17:39 compute-1 ceph-mon[79770]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 06 10:17:39 compute-1 ceph-mon[79770]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Dec 06 10:17:39 compute-1 ceph-mon[79770]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Dec 06 10:17:39 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/2688295501' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Dec 06 10:17:39 compute-1 ceph-mon[79770]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Dec 06 10:17:39 compute-1 ceph-mon[79770]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Dec 06 10:17:39 compute-1 ceph-mon[79770]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 06 10:17:39 compute-1 ceph-mon[79770]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 06 10:17:39 compute-1 ceph-mon[79770]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 06 10:17:39 compute-1 ceph-mon[79770]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 06 10:17:39 compute-1 ceph-mon[79770]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Dec 06 10:17:39 compute-1 ceph-mon[79770]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Dec 06 10:17:39 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 06 10:17:39 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 06 10:17:40 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Dec 06 10:17:40 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Dec 06 10:17:40 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 06 10:17:40 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 06 10:17:40 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Dec 06 10:17:40 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Dec 06 10:17:40 compute-1 ceph-mon[79770]: from='client.17745 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:17:40 compute-1 ceph-mon[79770]: from='client.26045 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:17:40 compute-1 ceph-mon[79770]: from='client.26072 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:17:40 compute-1 ceph-mon[79770]: pgmap v1098: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:17:40 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/1975698630' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Dec 06 10:17:40 compute-1 ceph-mon[79770]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 06 10:17:40 compute-1 ceph-mon[79770]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 06 10:17:40 compute-1 ceph-mon[79770]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Dec 06 10:17:40 compute-1 ceph-mon[79770]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Dec 06 10:17:40 compute-1 ceph-mon[79770]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Dec 06 10:17:40 compute-1 ceph-mon[79770]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Dec 06 10:17:40 compute-1 ceph-mon[79770]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 06 10:17:40 compute-1 ceph-mon[79770]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 06 10:17:40 compute-1 ceph-mon[79770]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Dec 06 10:17:40 compute-1 ceph-mon[79770]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Dec 06 10:17:40 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/4131632034' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Dec 06 10:17:40 compute-1 ceph-mon[79770]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 06 10:17:40 compute-1 ceph-mon[79770]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 06 10:17:40 compute-1 ceph-mon[79770]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Dec 06 10:17:40 compute-1 ceph-mon[79770]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Dec 06 10:17:40 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/2870637964' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Dec 06 10:17:40 compute-1 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #61. Immutable memtables: 0.
Dec 06 10:17:40 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:17:40.834504) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 06 10:17:40 compute-1 ceph-mon[79770]: rocksdb: [db/flush_job.cc:856] [default] [JOB 35] Flushing memtable with next log file: 61
Dec 06 10:17:40 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016260834596, "job": 35, "event": "flush_started", "num_memtables": 1, "num_entries": 2599, "num_deletes": 251, "total_data_size": 6459693, "memory_usage": 6555664, "flush_reason": "Manual Compaction"}
Dec 06 10:17:40 compute-1 ceph-mon[79770]: rocksdb: [db/flush_job.cc:885] [default] [JOB 35] Level-0 flush table #62: started
Dec 06 10:17:40 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016260854665, "cf_name": "default", "job": 35, "event": "table_file_creation", "file_number": 62, "file_size": 4202135, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 31202, "largest_seqno": 33796, "table_properties": {"data_size": 4191058, "index_size": 6931, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3077, "raw_key_size": 26531, "raw_average_key_size": 21, "raw_value_size": 4167766, "raw_average_value_size": 3421, "num_data_blocks": 296, "num_entries": 1218, "num_filter_entries": 1218, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765016059, "oldest_key_time": 1765016059, "file_creation_time": 1765016260, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 62, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:17:40 compute-1 ceph-mon[79770]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 35] Flush lasted 20218 microseconds, and 8487 cpu microseconds.
Dec 06 10:17:40 compute-1 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:17:40 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:17:40.854716) [db/flush_job.cc:967] [default] [JOB 35] Level-0 flush table #62: 4202135 bytes OK
Dec 06 10:17:40 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:17:40.854743) [db/memtable_list.cc:519] [default] Level-0 commit table #62 started
Dec 06 10:17:40 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:17:40.856182) [db/memtable_list.cc:722] [default] Level-0 commit table #62: memtable #1 done
Dec 06 10:17:40 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:17:40.856196) EVENT_LOG_v1 {"time_micros": 1765016260856192, "job": 35, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 06 10:17:40 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:17:40.856213) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 06 10:17:40 compute-1 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 35] Try to delete WAL files size 6447513, prev total WAL file size 6447513, number of live WAL files 2.
Dec 06 10:17:40 compute-1 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000058.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:17:40 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:17:40.857648) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032353130' seq:72057594037927935, type:22 .. '7061786F730032373632' seq:0, type:0; will stop at (end)
Dec 06 10:17:40 compute-1 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 36] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 06 10:17:40 compute-1 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 35 Base level 0, inputs: [62(4103KB)], [60(12MB)]
Dec 06 10:17:40 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016260857859, "job": 36, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [62], "files_L6": [60], "score": -1, "input_data_size": 17353577, "oldest_snapshot_seqno": -1}
Dec 06 10:17:40 compute-1 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 36] Generated table #63: 6530 keys, 15136589 bytes, temperature: kUnknown
Dec 06 10:17:40 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016260931779, "cf_name": "default", "job": 36, "event": "table_file_creation", "file_number": 63, "file_size": 15136589, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15093088, "index_size": 26045, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16389, "raw_key_size": 167539, "raw_average_key_size": 25, "raw_value_size": 14975612, "raw_average_value_size": 2293, "num_data_blocks": 1047, "num_entries": 6530, "num_filter_entries": 6530, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014012, "oldest_key_time": 0, "file_creation_time": 1765016260, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 63, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:17:40 compute-1 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:17:40 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:17:40.932191) [db/compaction/compaction_job.cc:1663] [default] [JOB 36] Compacted 1@0 + 1@6 files to L6 => 15136589 bytes
Dec 06 10:17:40 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:17:40.933855) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 234.4 rd, 204.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(4.0, 12.5 +0.0 blob) out(14.4 +0.0 blob), read-write-amplify(7.7) write-amplify(3.6) OK, records in: 7051, records dropped: 521 output_compression: NoCompression
Dec 06 10:17:40 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:17:40.933878) EVENT_LOG_v1 {"time_micros": 1765016260933865, "job": 36, "event": "compaction_finished", "compaction_time_micros": 74038, "compaction_time_cpu_micros": 33777, "output_level": 6, "num_output_files": 1, "total_output_size": 15136589, "num_input_records": 7051, "num_output_records": 6530, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 06 10:17:40 compute-1 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000062.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:17:40 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016260934659, "job": 36, "event": "table_file_deletion", "file_number": 62}
Dec 06 10:17:40 compute-1 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000060.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:17:40 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016260936606, "job": 36, "event": "table_file_deletion", "file_number": 60}
Dec 06 10:17:40 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:17:40.857519) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:17:40 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:17:40.936665) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:17:40 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:17:40.936670) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:17:40 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:17:40.936672) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:17:40 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:17:40.936674) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:17:40 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:17:40.936676) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
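
The EVENT_LOG_v1 entries above are single-line JSON payloads following the "EVENT_LOG_v1 " marker, so compaction job 36 can be re-derived offline. A minimal Python sketch, assuming journal lines like these arrive on stdin; pairing each compaction_started with its compaction_finished by job id reproduces the 234.4 rd / 204.4 wr MB/s figures the summary line reports:

    import json
    import sys

    events = []
    for line in sys.stdin:                      # e.g. piped from journalctl
        _, _, payload = line.partition("EVENT_LOG_v1 ")
        if payload.lstrip().startswith("{"):
            events.append(json.loads(payload))

    starts = {e["job"]: e for e in events if e["event"] == "compaction_started"}
    for ev in events:
        if ev["event"] == "compaction_finished" and ev["job"] in starts:
            secs = ev["compaction_time_micros"] / 1e6
            # 17353577 B read, 15136589 B written in 74038 us -> 234.4 rd, 204.4 wr
            print("job", ev["job"],
                  round(starts[ev["job"]]["input_data_size"] / secs / 1e6, 1), "rd",
                  round(ev["total_output_size"] / secs / 1e6, 1), "wr")
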
Dec 06 10:17:41 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config dump"} v 0)
Dec 06 10:17:41 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1892473375' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Dec 06 10:17:41 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:17:41 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:17:41 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:41.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:17:41 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:17:41 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:17:41 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:41.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
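
The beast lines radosgw keeps emitting have a fixed access-log shape (peer address, user, timestamp, request line, status, byte count, trailing latency), which makes the recurring anonymous "HEAD /" probes from .100 and .102 easy to tabulate. A sketch with a hand-written pattern; the regex is inferred from the lines above, not a documented format:

    import re

    BEAST = re.compile(
        r'beast: \S+: (?P<addr>\S+) - (?P<user>\S+) '
        r'\[(?P<ts>[^\]]+)\] "(?P<req>[^"]+)" (?P<status>\d+) (?P<bytes>\d+) '
        r'.*latency=(?P<latency>[\d.]+)s')

    line = ('beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous '
            '[06/Dec/2025:10:17:41.379 +0000] "HEAD / HTTP/1.0" 200 0 '
            '- - - latency=0.000000000s')
    m = BEAST.search(line)
    print(m.group("addr"), m.group("req"), m.group("status"),
          float(m.group("latency")))
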
Dec 06 10:17:41 compute-1 ceph-mon[79770]: from='client.27187 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:17:41 compute-1 ceph-mon[79770]: from='client.17835 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:17:41 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/1892473375' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Dec 06 10:17:41 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/2109662307' entity='client.admin' cmd=[{"prefix": "df"}]: dispatch
Dec 06 10:17:41 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/2595427889' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Dec 06 10:17:41 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/3040169991' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
Dec 06 10:17:41 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/1895847428' entity='client.admin' cmd=[{"prefix": "df"}]: dispatch
Dec 06 10:17:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:41 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:17:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:41 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:17:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:41 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:17:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:42 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
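
ganesha repeats this grace cycle roughly every five seconds in the capture, and every pass ends with rados_cluster_grace_enforcing: ret=-45. If that is a negated errno, as kernel-style return conventions would suggest (an assumption; the Ganesha source would have to confirm the convention), the symbolic name can be looked up directly:

    import errno
    import os

    ret = -45                                   # from the ganesha lines above
    # On Linux, errno 45 resolves to EL2NSYNC; whether ganesha really returns
    # negated errno values here is an assumption, not confirmed.
    print(errno.errorcode.get(-ret, "unknown"), "-", os.strerror(-ret))
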
Dec 06 10:17:42 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "detail": "detail"} v 0)
Dec 06 10:17:42 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1044876254' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Dec 06 10:17:42 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df"} v 0)
Dec 06 10:17:42 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3138446401' entity='client.admin' cmd=[{"prefix": "df"}]: dispatch
Dec 06 10:17:42 compute-1 ceph-mon[79770]: from='client.26150 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:17:42 compute-1 ceph-mon[79770]: pgmap v1099: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
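
The periodic pgmap lines (v1099 onward) are the cluster-health heartbeat of this capture: PG state counts plus data/used/avail totals and client throughput. A parsing sketch; the unit table only covers the B/KiB/MiB/GiB suffixes that actually occur in these lines:

    import re

    PGMAP = re.compile(
        r'pgmap v(?P<ver>\d+): (?P<pgs>\d+) pgs: .*?; '
        r'(?P<data>[\d.]+ \w+) data, (?P<used>[\d.]+ \w+) used, '
        r'(?P<avail>[\d.]+ \w+) / (?P<total>[\d.]+ \w+) avail')

    UNITS = {"B": 1, "KiB": 2**10, "MiB": 2**20, "GiB": 2**30}

    def to_bytes(field):
        value, unit = field.split()
        return float(value) * UNITS[unit]

    m = PGMAP.search("pgmap v1099: 337 pgs: 337 active+clean; 41 MiB data, "
                     "285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s")
    print(int(m.group("ver")), to_bytes(m.group("used")), to_bytes(m.group("avail")))
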
Dec 06 10:17:42 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/1044876254' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Dec 06 10:17:42 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/2211086827' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch
Dec 06 10:17:42 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/1756315161' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
Dec 06 10:17:42 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/3138446401' entity='client.admin' cmd=[{"prefix": "df"}]: dispatch
Dec 06 10:17:42 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/1991843052' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch
Dec 06 10:17:42 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "fs dump"} v 0)
Dec 06 10:17:42 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1645793006' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
Dec 06 10:17:43 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "fs ls"} v 0)
Dec 06 10:17:43 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/539913026' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch
Dec 06 10:17:43 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:17:43 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:17:43 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:43.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:17:43 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:17:43 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:17:43 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:43.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:17:43 compute-1 ceph-mon[79770]: from='client.27268 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:17:43 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/1694569205' entity='client.admin' cmd=[{"prefix": "mds stat"}]: dispatch
Dec 06 10:17:43 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/1645793006' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
Dec 06 10:17:43 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/1443159653' entity='client.admin' cmd=[{"prefix": "mon dump"}]: dispatch
Dec 06 10:17:43 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/539913026' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch
Dec 06 10:17:43 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/2121329675' entity='client.admin' cmd=[{"prefix": "mds stat"}]: dispatch
Dec 06 10:17:43 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:17:44 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump"} v 0)
Dec 06 10:17:44 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/22305521' entity='client.admin' cmd=[{"prefix": "mon dump"}]: dispatch
Dec 06 10:17:44 compute-1 ceph-mon[79770]: from='client.17889 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:17:44 compute-1 ceph-mon[79770]: from='client.27307 -' entity='client.admin' cmd=[{"prefix": "osd blocked-by", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:17:44 compute-1 ceph-mon[79770]: from='client.26192 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:17:44 compute-1 ceph-mon[79770]: pgmap v1100: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:17:44 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/2968187861' entity='client.admin' cmd=[{"prefix": "mon dump"}]: dispatch
Dec 06 10:17:44 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/1515055382' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch
Dec 06 10:17:44 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/1000968456' entity='client.admin' cmd=[{"prefix": "mds stat"}]: dispatch
Dec 06 10:17:44 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/22305521' entity='client.admin' cmd=[{"prefix": "mon dump"}]: dispatch
Dec 06 10:17:44 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/259075150' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch
Dec 06 10:17:45 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:17:45 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:17:45 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:45.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:17:45 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:17:45 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:17:45 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:45.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:17:45 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd blocklist ls"} v 0)
Dec 06 10:17:45 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/917006680' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch
Dec 06 10:17:46 compute-1 nova_compute[228576]: 2025-12-06 10:17:46.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:17:46 compute-1 nova_compute[228576]: 2025-12-06 10:17:46.471 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:17:46 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd dump"} v 0)
Dec 06 10:17:46 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1990391975' entity='client.admin' cmd=[{"prefix": "osd dump"}]: dispatch
Dec 06 10:17:46 compute-1 ovs-appctl[242760]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Dec 06 10:17:46 compute-1 ovs-appctl[242764]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Dec 06 10:17:46 compute-1 ovs-appctl[242770]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
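
The three WARN lines come from probing ovs-monitor-ipsec, a daemon that was never started on this host, so its pidfile does not exist. A guard sketch that checks for the pidfile before probing; the path is taken from the log, while the probe command itself is illustrative:

    import os
    import subprocess

    PIDFILE = "/var/run/openvswitch/ovs-monitor-ipsec.pid"

    if os.path.exists(PIDFILE):
        # "version" is the generic unixctl command; any target-specific
        # command could be substituted here.
        subprocess.run(["ovs-appctl", "-t", "ovs-monitor-ipsec", "version"],
                       check=False)
    else:
        print("ovs-monitor-ipsec not running; skipping probe")
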
Dec 06 10:17:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:46 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:17:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:46 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:17:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:46 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:17:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:47 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:17:47 compute-1 ceph-mon[79770]: from='client.17922 -' entity='client.admin' cmd=[{"prefix": "osd blocked-by", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:17:47 compute-1 ceph-mon[79770]: from='client.27334 -' entity='client.admin' cmd=[{"prefix": "osd df", "output_method": "tree", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:17:47 compute-1 ceph-mon[79770]: from='client.27340 -' entity='client.admin' cmd=[{"prefix": "osd df", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:17:47 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/2038668301' entity='client.admin' cmd=[{"prefix": "osd dump"}]: dispatch
Dec 06 10:17:47 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/917006680' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch
Dec 06 10:17:47 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/3502764016' entity='client.admin' cmd=[{"prefix": "osd numa-status"}]: dispatch
Dec 06 10:17:47 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:17:47 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:17:47 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:47.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:17:47 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:17:47 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:17:47 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:47.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:17:48 compute-1 ceph-mon[79770]: from='client.26210 -' entity='client.admin' cmd=[{"prefix": "osd blocked-by", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:17:48 compute-1 ceph-mon[79770]: from='client.17943 -' entity='client.admin' cmd=[{"prefix": "osd df", "output_method": "tree", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:17:48 compute-1 ceph-mon[79770]: from='client.17961 -' entity='client.admin' cmd=[{"prefix": "osd df", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:17:48 compute-1 ceph-mon[79770]: from='client.26228 -' entity='client.admin' cmd=[{"prefix": "osd df", "output_method": "tree", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:17:48 compute-1 ceph-mon[79770]: pgmap v1101: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:17:48 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/1531041810' entity='client.admin' cmd=[{"prefix": "osd dump"}]: dispatch
Dec 06 10:17:48 compute-1 ceph-mon[79770]: from='client.26240 -' entity='client.admin' cmd=[{"prefix": "osd df", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:17:48 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/2349497416' entity='client.admin' cmd=[{"prefix": "osd numa-status"}]: dispatch
Dec 06 10:17:48 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/1990391975' entity='client.admin' cmd=[{"prefix": "osd dump"}]: dispatch
Dec 06 10:17:48 compute-1 ceph-mon[79770]: from='client.17988 -' entity='client.admin' cmd=[{"prefix": "osd perf", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:17:48 compute-1 ceph-mon[79770]: from='client.? 192.168.122.10:0/2700179710' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 06 10:17:48 compute-1 ceph-mon[79770]: from='client.? 192.168.122.10:0/2700179710' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 06 10:17:48 compute-1 ceph-mon[79770]: from='client.17994 -' entity='client.admin' cmd=[{"prefix": "osd pool autoscale-status", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:17:48 compute-1 ceph-mon[79770]: from='client.27397 -' entity='client.admin' cmd=[{"prefix": "osd perf", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:17:48 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/3610447336' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail"}]: dispatch
Dec 06 10:17:48 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/2607835880' entity='client.admin' cmd=[{"prefix": "osd numa-status"}]: dispatch
Dec 06 10:17:48 compute-1 ceph-mon[79770]: from='client.27409 -' entity='client.admin' cmd=[{"prefix": "osd pool autoscale-status", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:17:48 compute-1 ceph-mon[79770]: pgmap v1102: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:17:48 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/3586252455' entity='client.admin' cmd=[{"prefix": "osd stat"}]: dispatch
Dec 06 10:17:48 compute-1 ceph-mon[79770]: from='client.26261 -' entity='client.admin' cmd=[{"prefix": "osd perf", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:17:48 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/950858382' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail"}]: dispatch
Dec 06 10:17:48 compute-1 nova_compute[228576]: 2025-12-06 10:17:48.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:17:48 compute-1 nova_compute[228576]: 2025-12-06 10:17:48.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:17:48 compute-1 nova_compute[228576]: 2025-12-06 10:17:48.500 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:17:48 compute-1 nova_compute[228576]: 2025-12-06 10:17:48.501 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:17:48 compute-1 nova_compute[228576]: 2025-12-06 10:17:48.501 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:17:48 compute-1 nova_compute[228576]: 2025-12-06 10:17:48.501 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:17:48 compute-1 nova_compute[228576]: 2025-12-06 10:17:48.502 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:17:48 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd pool ls", "detail": "detail"} v 0)
Dec 06 10:17:48 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1455478023' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail"}]: dispatch
Dec 06 10:17:48 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:17:49 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:17:49 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1693304027' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:17:49 compute-1 nova_compute[228576]: 2025-12-06 10:17:49.084 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.582s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
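
Nova's resource audit shells out to the same CLI an operator would use, and the invocation logged above can be reproduced verbatim to see what the tracker saw. A subprocess sketch, assuming the openstack keyring and /etc/ceph/ceph.conf referenced in the log are present; the stats keys are the usual ceph df JSON fields:

    import json
    import subprocess

    # Same command nova_compute logs via oslo_concurrency.processutils above.
    out = subprocess.run(
        ["ceph", "df", "--format=json", "--id", "openstack",
         "--conf", "/etc/ceph/ceph.conf"],
        check=True, capture_output=True, text=True).stdout

    stats = json.loads(out)["stats"]
    print("total:", stats["total_bytes"],
          "avail:", stats["total_avail_bytes"])
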
Dec 06 10:17:49 compute-1 nova_compute[228576]: 2025-12-06 10:17:49.267 228580 WARNING nova.virt.libvirt.driver [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:17:49 compute-1 nova_compute[228576]: 2025-12-06 10:17:49.269 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4988MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:17:49 compute-1 nova_compute[228576]: 2025-12-06 10:17:49.269 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:17:49 compute-1 nova_compute[228576]: 2025-12-06 10:17:49.269 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:17:49 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/1740259575' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:17:49 compute-1 ceph-mon[79770]: from='client.18036 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:17:49 compute-1 ceph-mon[79770]: from='client.26267 -' entity='client.admin' cmd=[{"prefix": "osd pool autoscale-status", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:17:49 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/820439724' entity='client.admin' cmd=[{"prefix": "osd stat"}]: dispatch
Dec 06 10:17:49 compute-1 ceph-mon[79770]: from='client.27448 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:17:49 compute-1 ceph-mon[79770]: from='client.18045 -' entity='client.admin' cmd=[{"prefix": "pg stat", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:17:49 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/1455478023' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail"}]: dispatch
Dec 06 10:17:49 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/1693304027' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:17:49 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/4160220492' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:17:49 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd stat"} v 0)
Dec 06 10:17:49 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3508033125' entity='client.admin' cmd=[{"prefix": "osd stat"}]: dispatch
Dec 06 10:17:49 compute-1 nova_compute[228576]: 2025-12-06 10:17:49.347 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:17:49 compute-1 nova_compute[228576]: 2025-12-06 10:17:49.347 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:17:49 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:17:49 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:17:49 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:49.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:17:49 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:17:49 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:17:49 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:49.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:17:49 compute-1 nova_compute[228576]: 2025-12-06 10:17:49.402 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:17:49 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:17:49 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3766028525' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:17:49 compute-1 nova_compute[228576]: 2025-12-06 10:17:49.885 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.483s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:17:49 compute-1 nova_compute[228576]: 2025-12-06 10:17:49.893 228580 DEBUG nova.compute.provider_tree [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed in ProviderTree for provider: ff2f17cb-ff1d-4da7-9560-4be741380cb1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:17:49 compute-1 nova_compute[228576]: 2025-12-06 10:17:49.914 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed for provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:17:49 compute-1 nova_compute[228576]: 2025-12-06 10:17:49.916 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:17:49 compute-1 nova_compute[228576]: 2025-12-06 10:17:49.916 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.647s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
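
The inventory dict reported to placement above fixes this host's schedulable capacity as (total - reserved) * allocation_ratio per resource class, which is how the scheduler will count it. Plugging in the logged values as a quick check:

    # Values copied from the set_inventory_for_provider line above.
    inventory = {
        "VCPU":      {"total": 8,    "reserved": 0,   "allocation_ratio": 4.0},
        "MEMORY_MB": {"total": 7679, "reserved": 512, "allocation_ratio": 1.0},
        "DISK_GB":   {"total": 59,   "reserved": 0,   "allocation_ratio": 0.9},
    }

    for rc, inv in inventory.items():
        capacity = (inv["total"] - inv["reserved"]) * inv["allocation_ratio"]
        print(rc, capacity)         # VCPU 32.0, MEMORY_MB 7167.0, DISK_GB 53.1
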
Dec 06 10:17:50 compute-1 ceph-mon[79770]: from='client.27469 -' entity='client.admin' cmd=[{"prefix": "pg stat", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:17:50 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/3508033125' entity='client.admin' cmd=[{"prefix": "osd stat"}]: dispatch
Dec 06 10:17:50 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/242421932' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Dec 06 10:17:50 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/1985334892' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Dec 06 10:17:50 compute-1 ceph-mon[79770]: from='client.26303 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:17:50 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/3766028525' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:17:50 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/2876669399' entity='client.admin' cmd=[{"prefix": "time-sync-status"}]: dispatch
Dec 06 10:17:50 compute-1 ceph-mon[79770]: pgmap v1103: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:17:50 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/1751496017' entity='client.admin' cmd=[{"prefix": "time-sync-status"}]: dispatch
Dec 06 10:17:50 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/739045502' entity='client.admin' cmd=[{"prefix": "config dump", "format": "json-pretty"}]: dispatch
Dec 06 10:17:50 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "status"} v 0)
Dec 06 10:17:50 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2127390853' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Dec 06 10:17:50 compute-1 nova_compute[228576]: 2025-12-06 10:17:50.916 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:17:50 compute-1 nova_compute[228576]: 2025-12-06 10:17:50.917 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:17:51 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "time-sync-status"} v 0)
Dec 06 10:17:51 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3232134178' entity='client.admin' cmd=[{"prefix": "time-sync-status"}]: dispatch
Dec 06 10:17:51 compute-1 ceph-mon[79770]: from='client.26315 -' entity='client.admin' cmd=[{"prefix": "pg stat", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:17:51 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/2380510369' entity='client.admin' cmd=[{"prefix": "config dump", "format": "json-pretty"}]: dispatch
Dec 06 10:17:51 compute-1 ceph-mon[79770]: from='client.18099 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:17:51 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/2127390853' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Dec 06 10:17:51 compute-1 ceph-mon[79770]: from='client.27514 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:17:51 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/3494036148' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail", "format": "json-pretty"}]: dispatch
Dec 06 10:17:51 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/3232134178' entity='client.admin' cmd=[{"prefix": "time-sync-status"}]: dispatch
Dec 06 10:17:51 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/2132475846' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail", "format": "json-pretty"}]: dispatch
Dec 06 10:17:51 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:17:51 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:17:51 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:51.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:17:51 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:17:51 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:17:51 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:51.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:17:51 compute-1 nova_compute[228576]: 2025-12-06 10:17:51.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:17:51 compute-1 nova_compute[228576]: 2025-12-06 10:17:51.472 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:17:51 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config dump", "format": "json-pretty"} v 0)
Dec 06 10:17:51 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3949318305' entity='client.admin' cmd=[{"prefix": "config dump", "format": "json-pretty"}]: dispatch
Dec 06 10:17:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:51 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:17:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:51 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:17:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:51 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:17:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:52 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:17:52 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/3949318305' entity='client.admin' cmd=[{"prefix": "config dump", "format": "json-pretty"}]: dispatch
Dec 06 10:17:52 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/3484183676' entity='client.admin' cmd=[{"prefix": "df", "format": "json-pretty"}]: dispatch
Dec 06 10:17:52 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/1987338407' entity='client.admin' cmd=[{"prefix": "df", "format": "json-pretty"}]: dispatch
Dec 06 10:17:52 compute-1 ceph-mon[79770]: pgmap v1104: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:17:52 compute-1 ceph-mon[79770]: from='client.26348 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:17:52 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/130199360' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json-pretty"}]: dispatch
Dec 06 10:17:52 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/2505701622' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json-pretty"}]: dispatch
Dec 06 10:17:52 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "detail": "detail", "format": "json-pretty"} v 0)
Dec 06 10:17:52 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2596899974' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail", "format": "json-pretty"}]: dispatch
Dec 06 10:17:52 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json-pretty"} v 0)
Dec 06 10:17:52 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/562050907' entity='client.admin' cmd=[{"prefix": "df", "format": "json-pretty"}]: dispatch
Dec 06 10:17:53 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "fs dump", "format": "json-pretty"} v 0)
Dec 06 10:17:53 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1958371454' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json-pretty"}]: dispatch
Dec 06 10:17:53 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:17:53 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:17:53 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:17:53 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:53.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:17:53 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:17:53 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:53.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:17:53 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/2596899974' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail", "format": "json-pretty"}]: dispatch
Dec 06 10:17:53 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/860231069' entity='client.admin' cmd=[{"prefix": "fs ls", "format": "json-pretty"}]: dispatch
Dec 06 10:17:53 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/2093660645' entity='client.admin' cmd=[{"prefix": "fs ls", "format": "json-pretty"}]: dispatch
Dec 06 10:17:53 compute-1 ceph-mon[79770]: from='client.18141 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:17:53 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/562050907' entity='client.admin' cmd=[{"prefix": "df", "format": "json-pretty"}]: dispatch
Dec 06 10:17:53 compute-1 ceph-mon[79770]: from='client.27574 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:17:53 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/1626240640' entity='client.admin' cmd=[{"prefix": "mds stat", "format": "json-pretty"}]: dispatch
Dec 06 10:17:53 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/1958371454' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json-pretty"}]: dispatch
Dec 06 10:17:53 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "fs ls", "format": "json-pretty"} v 0)
Dec 06 10:17:53 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/762392159' entity='client.admin' cmd=[{"prefix": "fs ls", "format": "json-pretty"}]: dispatch
Dec 06 10:17:53 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:17:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:17:54.292 141446 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:17:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:17:54.294 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:17:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:17:54.295 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:17:54 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mds stat", "format": "json-pretty"} v 0)
Dec 06 10:17:54 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2059299300' entity='client.admin' cmd=[{"prefix": "mds stat", "format": "json-pretty"}]: dispatch
Dec 06 10:17:54 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/2941709570' entity='client.admin' cmd=[{"prefix": "mds stat", "format": "json-pretty"}]: dispatch
Dec 06 10:17:54 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/762392159' entity='client.admin' cmd=[{"prefix": "fs ls", "format": "json-pretty"}]: dispatch
Dec 06 10:17:54 compute-1 ceph-mon[79770]: pgmap v1105: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:17:54 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/48299577' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json-pretty"}]: dispatch
Dec 06 10:17:54 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:17:54 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/1855919859' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json-pretty"}]: dispatch
Dec 06 10:17:55 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json-pretty"} v 0)
Dec 06 10:17:55 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2704565734' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json-pretty"}]: dispatch
Dec 06 10:17:55 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:17:55 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:17:55 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:55.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:17:55 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:17:55 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:17:55 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:55.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:17:55 compute-1 nova_compute[228576]: 2025-12-06 10:17:55.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:17:55 compute-1 nova_compute[228576]: 2025-12-06 10:17:55.472 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:17:55 compute-1 nova_compute[228576]: 2025-12-06 10:17:55.472 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:17:55 compute-1 nova_compute[228576]: 2025-12-06 10:17:55.497 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 06 10:17:55 compute-1 sudo[244449]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:17:55 compute-1 sudo[244449]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:17:55 compute-1 sudo[244449]: pam_unix(sudo:session): session closed for user root
Dec 06 10:17:55 compute-1 ceph-mon[79770]: from='client.27604 -' entity='client.admin' cmd=[{"prefix": "osd blocked-by", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:17:55 compute-1 ceph-mon[79770]: from='client.18180 -' entity='client.admin' cmd=[{"prefix": "osd blocked-by", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:17:55 compute-1 ceph-mon[79770]: from='client.26393 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:17:55 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/1211427813' entity='client.admin' cmd=[{"prefix": "osd blocklist ls", "format": "json-pretty"}]: dispatch
Dec 06 10:17:55 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/2059299300' entity='client.admin' cmd=[{"prefix": "mds stat", "format": "json-pretty"}]: dispatch
Dec 06 10:17:55 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/1663530059' entity='client.admin' cmd=[{"prefix": "osd blocklist ls", "format": "json-pretty"}]: dispatch
Dec 06 10:17:55 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/2704565734' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json-pretty"}]: dispatch
Dec 06 10:17:55 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/2626797575' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json-pretty"}]: dispatch
Dec 06 10:17:56 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd blocklist ls", "format": "json-pretty"} v 0)
Dec 06 10:17:56 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/897653303' entity='client.admin' cmd=[{"prefix": "osd blocklist ls", "format": "json-pretty"}]: dispatch
Dec 06 10:17:56 compute-1 ceph-mon[79770]: from='client.18198 -' entity='client.admin' cmd=[{"prefix": "osd df", "output_method": "tree", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:17:56 compute-1 ceph-mon[79770]: from='client.27631 -' entity='client.admin' cmd=[{"prefix": "osd df", "output_method": "tree", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:17:56 compute-1 ceph-mon[79770]: from='client.18210 -' entity='client.admin' cmd=[{"prefix": "osd df", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:17:56 compute-1 ceph-mon[79770]: from='client.27643 -' entity='client.admin' cmd=[{"prefix": "osd df", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:17:56 compute-1 ceph-mon[79770]: from='client.26423 -' entity='client.admin' cmd=[{"prefix": "osd blocked-by", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:17:56 compute-1 ceph-mon[79770]: pgmap v1106: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:17:56 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/34457584' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:17:56 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/4020160728' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json-pretty"}]: dispatch
Dec 06 10:17:56 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/897653303' entity='client.admin' cmd=[{"prefix": "osd blocklist ls", "format": "json-pretty"}]: dispatch
Dec 06 10:17:56 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/812042877' entity='client.admin' cmd=[{"prefix": "osd numa-status", "format": "json-pretty"}]: dispatch
Dec 06 10:17:56 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/443912516' entity='client.admin' cmd=[{"prefix": "osd numa-status", "format": "json-pretty"}]: dispatch
Dec 06 10:17:56 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/3394423495' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:17:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:56 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:17:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:56 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:17:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:56 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:17:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:17:57 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:17:57 compute-1 podman[244616]: 2025-12-06 10:17:57.036666316 +0000 UTC m=+0.180463259 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_controller)
Dec 06 10:17:57 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd dump", "format": "json-pretty"} v 0)
Dec 06 10:17:57 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3855571438' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json-pretty"}]: dispatch
Dec 06 10:17:57 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:17:57 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:17:57 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:57.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:17:57 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:17:57 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:17:57 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:57.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:17:57 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd numa-status", "format": "json-pretty"} v 0)
Dec 06 10:17:57 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3069230919' entity='client.admin' cmd=[{"prefix": "osd numa-status", "format": "json-pretty"}]: dispatch
Dec 06 10:17:57 compute-1 ceph-mon[79770]: from='client.26441 -' entity='client.admin' cmd=[{"prefix": "osd df", "output_method": "tree", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:17:57 compute-1 ceph-mon[79770]: from='client.18255 -' entity='client.admin' cmd=[{"prefix": "osd perf", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:17:57 compute-1 ceph-mon[79770]: from='client.27694 -' entity='client.admin' cmd=[{"prefix": "osd perf", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:17:57 compute-1 ceph-mon[79770]: from='client.18267 -' entity='client.admin' cmd=[{"prefix": "osd pool autoscale-status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:17:57 compute-1 ceph-mon[79770]: from='client.26453 -' entity='client.admin' cmd=[{"prefix": "osd df", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:17:57 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/1210231069' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail", "format": "json-pretty"}]: dispatch
Dec 06 10:17:57 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/3855571438' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json-pretty"}]: dispatch
Dec 06 10:17:57 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/3263841297' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail", "format": "json-pretty"}]: dispatch
Dec 06 10:17:57 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/3069230919' entity='client.admin' cmd=[{"prefix": "osd numa-status", "format": "json-pretty"}]: dispatch
Dec 06 10:17:57 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/2354413920' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json-pretty"}]: dispatch
Dec 06 10:17:57 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/3483261157' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json-pretty"}]: dispatch
Dec 06 10:17:58 compute-1 nova_compute[228576]: 2025-12-06 10:17:58.490 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:17:58 compute-1 virtqemud[228188]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Dec 06 10:17:58 compute-1 systemd[1]: Starting Time & Date Service...
Dec 06 10:17:58 compute-1 systemd[1]: Started Time & Date Service.
Dec 06 10:17:58 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd pool ls", "detail": "detail", "format": "json-pretty"} v 0)
Dec 06 10:17:58 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2207639809' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail", "format": "json-pretty"}]: dispatch
Dec 06 10:17:58 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:17:59 compute-1 ceph-mon[79770]: from='client.27703 -' entity='client.admin' cmd=[{"prefix": "osd pool autoscale-status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:17:59 compute-1 ceph-mon[79770]: pgmap v1107: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:17:59 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/2207639809' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail", "format": "json-pretty"}]: dispatch
Dec 06 10:17:59 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd stat", "format": "json-pretty"} v 0)
Dec 06 10:17:59 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1008317242' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json-pretty"}]: dispatch
Dec 06 10:17:59 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:17:59 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:17:59 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:17:59 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:17:59.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:17:59 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:17:59 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:17:59.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:18:00 compute-1 ceph-mon[79770]: from='client.26480 -' entity='client.admin' cmd=[{"prefix": "osd perf", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:18:00 compute-1 ceph-mon[79770]: from='client.18303 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:18:00 compute-1 ceph-mon[79770]: from='client.27736 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:18:00 compute-1 ceph-mon[79770]: from='client.26486 -' entity='client.admin' cmd=[{"prefix": "osd pool autoscale-status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:18:00 compute-1 ceph-mon[79770]: from='client.18315 -' entity='client.admin' cmd=[{"prefix": "pg stat", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:18:00 compute-1 ceph-mon[79770]: from='client.27748 -' entity='client.admin' cmd=[{"prefix": "pg stat", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:18:00 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/4242664383' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
Dec 06 10:18:00 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/2060160573' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
Dec 06 10:18:00 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/1008317242' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json-pretty"}]: dispatch
Dec 06 10:18:00 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/346786115' entity='client.admin' cmd=[{"prefix": "time-sync-status", "format": "json-pretty"}]: dispatch
Dec 06 10:18:00 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/2859090236' entity='client.admin' cmd=[{"prefix": "time-sync-status", "format": "json-pretty"}]: dispatch
Dec 06 10:18:00 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "status", "format": "json-pretty"} v 0)
Dec 06 10:18:00 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2167113296' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
Dec 06 10:18:01 compute-1 ceph-mon[79770]: from='client.26507 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:18:01 compute-1 ceph-mon[79770]: pgmap v1108: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:18:01 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/2167113296' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
Dec 06 10:18:01 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "time-sync-status", "format": "json-pretty"} v 0)
Dec 06 10:18:01 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2123486085' entity='client.admin' cmd=[{"prefix": "time-sync-status", "format": "json-pretty"}]: dispatch
Dec 06 10:18:01 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:18:01 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:18:01 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:18:01 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:01.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:18:01 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:18:01 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:01.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:18:01 compute-1 podman[245305]: 2025-12-06 10:18:01.760500116 +0000 UTC m=+0.064001066 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:18:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:01 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:18:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:01 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:18:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:01 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:18:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:02 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:18:02 compute-1 ceph-mon[79770]: from='client.26513 -' entity='client.admin' cmd=[{"prefix": "pg stat", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:18:02 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/2123486085' entity='client.admin' cmd=[{"prefix": "time-sync-status", "format": "json-pretty"}]: dispatch
Dec 06 10:18:03 compute-1 ceph-mon[79770]: pgmap v1109: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:18:03 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:18:03 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:18:03 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:03.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:18:03 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:18:03 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:18:03 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:03.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:18:03 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:18:04 compute-1 ceph-mon[79770]: pgmap v1110: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:18:04 compute-1 podman[245327]: 2025-12-06 10:18:04.752862265 +0000 UTC m=+0.059099734 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:18:05 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:18:05 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:18:05 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:05.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:18:05 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:18:05 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:18:05 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:05.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:18:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:06 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:18:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:06 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:18:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:06 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:18:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:07 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:18:07 compute-1 ceph-mon[79770]: pgmap v1111: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:18:07 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:18:07 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:18:07 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:07.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:18:07 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:18:07 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:18:07 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:07.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:18:08 compute-1 ceph-mon[79770]: pgmap v1112: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:18:08 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:18:09 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:18:09 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:18:09 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:18:09 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:09.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:18:09 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:18:09 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:18:09 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:09.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:18:10 compute-1 ceph-mon[79770]: pgmap v1113: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:18:11 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:18:11 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:18:11 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:11.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:18:11 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:18:11 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:18:11 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:11.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:18:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:11 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:18:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:11 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:18:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:11 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:18:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:12 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:18:13 compute-1 ceph-mon[79770]: pgmap v1114: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:18:13 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:18:13 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:18:13 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:13.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:18:13 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:18:13 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:18:13 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:13.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:18:13 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:18:15 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:18:15 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:18:15 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:15.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:18:15 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:18:15 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:18:15 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:15.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:18:15 compute-1 sudo[245353]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:18:15 compute-1 sudo[245353]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:18:15 compute-1 sudo[245353]: pam_unix(sudo:session): session closed for user root
Dec 06 10:18:15 compute-1 ceph-mon[79770]: pgmap v1115: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:18:16 compute-1 ceph-mon[79770]: pgmap v1116: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:18:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:16 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:18:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:16 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:18:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:16 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:18:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:17 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:18:17 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:18:17 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:18:17 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:17.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:18:17 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:18:17 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:18:17 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:17.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:18:17 compute-1 sudo[245379]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:18:17 compute-1 sudo[245379]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:18:17 compute-1 sudo[245379]: pam_unix(sudo:session): session closed for user root
Dec 06 10:18:17 compute-1 sudo[245404]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 06 10:18:17 compute-1 sudo[245404]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:18:18 compute-1 sudo[245404]: pam_unix(sudo:session): session closed for user root
Dec 06 10:18:18 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:18:19 compute-1 ceph-mon[79770]: pgmap v1117: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:18:19 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 10:18:19 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 06 10:18:19 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:18:19 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:18:19 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 06 10:18:19 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 06 10:18:19 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 10:18:19 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:18:19 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:18:19 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:19.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:18:19 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:18:19 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:18:19 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:19.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:18:20 compute-1 ceph-mon[79770]: pgmap v1118: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:18:21 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:18:21 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:18:21 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:21.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:18:21 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:18:21 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:18:21 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:21.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:18:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:21 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:18:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:21 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:18:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:21 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:18:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:22 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:18:23 compute-1 ceph-mon[79770]: pgmap v1119: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:18:23 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:18:23 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:18:23 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:23.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:18:23 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:18:23 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:18:23 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:23.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:18:23 compute-1 sudo[245464]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:18:23 compute-1 sudo[245464]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:18:23 compute-1 sudo[245464]: pam_unix(sudo:session): session closed for user root
Dec 06 10:18:23 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:18:24 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:18:24 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:18:24 compute-1 ceph-mon[79770]: pgmap v1120: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:18:24 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:18:25 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:18:25 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:18:25 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:18:25 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:25.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:18:25 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:18:25 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:25.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:18:26 compute-1 ceph-mon[79770]: pgmap v1121: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:18:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:27 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:18:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:27 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:18:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:27 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:18:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:27 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:18:27 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:18:27 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:18:27 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:18:27 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:27.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:18:27 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:18:27 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:27.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:18:27 compute-1 podman[245491]: 2025-12-06 10:18:27.806097953 +0000 UTC m=+0.104111499 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Dec 06 10:18:28 compute-1 systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec 06 10:18:28 compute-1 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 06 10:18:28 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:18:29 compute-1 ceph-mon[79770]: pgmap v1122: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:18:29 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:18:29 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:18:29 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:18:29 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:29.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:18:29 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:18:29 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:29.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:18:31 compute-1 ceph-mon[79770]: pgmap v1123: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:18:31 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:18:31 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:18:31 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:18:31 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:31.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:18:31 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:18:31 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:31.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:18:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:31 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:18:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:32 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:18:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:32 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:18:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:32 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:18:32 compute-1 ceph-mon[79770]: pgmap v1124: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:18:32 compute-1 podman[245524]: 2025-12-06 10:18:32.408730532 +0000 UTC m=+0.061057803 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 06 10:18:33 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:18:33 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:18:33 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:18:33 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:33.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:18:33 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:18:33 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:33.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:18:33 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:18:35 compute-1 ceph-mon[79770]: pgmap v1125: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:18:35 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:18:35 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:18:35 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:18:35 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:18:35 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:35.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:18:35 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:35.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:18:35 compute-1 podman[245546]: 2025-12-06 10:18:35.747048837 +0000 UTC m=+0.055468065 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 06 10:18:35 compute-1 sudo[245567]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:18:35 compute-1 sudo[245567]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:18:35 compute-1 sudo[245567]: pam_unix(sudo:session): session closed for user root
Dec 06 10:18:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:36 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:18:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:36 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:18:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:36 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:18:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:37 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:18:37 compute-1 ceph-mon[79770]: pgmap v1126: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:18:37 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:18:37 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:18:37 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:18:37 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:37.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:18:37 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:18:37 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:37.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:18:38 compute-1 ceph-mon[79770]: pgmap v1127: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:18:38 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:18:39 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:18:39 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:18:39 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:18:39 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:18:39 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:18:39 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:39.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:18:39 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:39.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:18:40 compute-1 ceph-mon[79770]: pgmap v1128: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:18:41 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:18:41 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:18:41 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:18:41 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:18:41 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:41.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:18:41 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:41.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:18:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:41 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:18:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:41 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:18:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:41 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:18:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:42 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:18:43 compute-1 ceph-mon[79770]: pgmap v1129: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:18:43 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:18:43 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:18:43 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:18:43 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:18:43 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:43.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:18:43 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:43.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:18:43 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:18:45 compute-1 ceph-mon[79770]: pgmap v1130: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:18:45 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:18:45 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:18:45 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:45.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:18:45 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:18:45 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:18:45 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:45.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:18:46 compute-1 ceph-mon[79770]: pgmap v1131: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:18:46 compute-1 ceph-mon[79770]: from='client.? 192.168.122.10:0/3435933179' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 06 10:18:46 compute-1 ceph-mon[79770]: from='client.? 192.168.122.10:0/3435933179' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 06 10:18:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:46 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:18:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:46 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:18:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:46 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:18:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:47 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:18:47 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:18:47 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:18:47 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:18:47 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:47.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:18:47 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:18:47 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:47.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:18:48 compute-1 ceph-mon[79770]: pgmap v1132: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:18:48 compute-1 nova_compute[228576]: 2025-12-06 10:18:48.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:18:48 compute-1 nova_compute[228576]: 2025-12-06 10:18:48.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:18:48 compute-1 nova_compute[228576]: 2025-12-06 10:18:48.472 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:18:48 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:18:49 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/792511093' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:18:49 compute-1 nova_compute[228576]: 2025-12-06 10:18:49.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:18:49 compute-1 nova_compute[228576]: 2025-12-06 10:18:49.495 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:18:49 compute-1 nova_compute[228576]: 2025-12-06 10:18:49.495 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:18:49 compute-1 nova_compute[228576]: 2025-12-06 10:18:49.496 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:18:49 compute-1 nova_compute[228576]: 2025-12-06 10:18:49.496 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:18:49 compute-1 nova_compute[228576]: 2025-12-06 10:18:49.496 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:18:49 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:18:49 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:18:49 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:18:49 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:18:49 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:49.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:18:49 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:49.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:18:49 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:18:49 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2553381759' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:18:49 compute-1 nova_compute[228576]: 2025-12-06 10:18:49.953 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:18:50 compute-1 nova_compute[228576]: 2025-12-06 10:18:50.126 228580 WARNING nova.virt.libvirt.driver [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:18:50 compute-1 nova_compute[228576]: 2025-12-06 10:18:50.128 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5053MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:18:50 compute-1 nova_compute[228576]: 2025-12-06 10:18:50.128 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:18:50 compute-1 nova_compute[228576]: 2025-12-06 10:18:50.129 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:18:50 compute-1 nova_compute[228576]: 2025-12-06 10:18:50.187 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:18:50 compute-1 nova_compute[228576]: 2025-12-06 10:18:50.188 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:18:50 compute-1 nova_compute[228576]: 2025-12-06 10:18:50.221 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:18:50 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/2553381759' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:18:50 compute-1 ceph-mon[79770]: pgmap v1133: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:18:50 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/1909379044' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:18:50 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:18:50 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/748120430' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:18:50 compute-1 nova_compute[228576]: 2025-12-06 10:18:50.680 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:18:50 compute-1 nova_compute[228576]: 2025-12-06 10:18:50.687 228580 DEBUG nova.compute.provider_tree [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed in ProviderTree for provider: ff2f17cb-ff1d-4da7-9560-4be741380cb1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:18:50 compute-1 nova_compute[228576]: 2025-12-06 10:18:50.706 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed for provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:18:50 compute-1 nova_compute[228576]: 2025-12-06 10:18:50.708 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:18:50 compute-1 nova_compute[228576]: 2025-12-06 10:18:50.708 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.579s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:18:51 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/748120430' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:18:51 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:18:51 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:18:51 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:18:51 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:51.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:18:51 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:18:51 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:51.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:18:51 compute-1 nova_compute[228576]: 2025-12-06 10:18:51.709 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:18:51 compute-1 nova_compute[228576]: 2025-12-06 10:18:51.742 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:18:51 compute-1 nova_compute[228576]: 2025-12-06 10:18:51.743 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:18:51 compute-1 nova_compute[228576]: 2025-12-06 10:18:51.743 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:18:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:51 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:18:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:51 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:18:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:51 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:18:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:52 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:18:52 compute-1 ceph-mon[79770]: pgmap v1134: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:18:53 compute-1 sudo[238238]: pam_unix(sudo:session): session closed for user root
Dec 06 10:18:53 compute-1 sshd-session[238237]: Received disconnect from 192.168.122.10 port 38894:11: disconnected by user
Dec 06 10:18:53 compute-1 sshd-session[238237]: Disconnected from user zuul 192.168.122.10 port 38894
Dec 06 10:18:53 compute-1 sshd-session[238234]: pam_unix(sshd:session): session closed for user zuul
Dec 06 10:18:53 compute-1 systemd-logind[788]: Session 55 logged out. Waiting for processes to exit.
Dec 06 10:18:53 compute-1 systemd[1]: session-55.scope: Deactivated successfully.
Dec 06 10:18:53 compute-1 systemd[1]: session-55.scope: Consumed 3min 447ms CPU time, 760.4M memory peak, read 308.9M from disk, written 70.4M to disk.
Dec 06 10:18:53 compute-1 systemd-logind[788]: Removed session 55.
Dec 06 10:18:53 compute-1 sshd-session[245645]: Accepted publickey for zuul from 192.168.122.10 port 41534 ssh2: ECDSA SHA256:r1j7aLsKAM+XxDNbzEU5vWGpGNCOaIBwc7FZdATPttA
Dec 06 10:18:53 compute-1 systemd-logind[788]: New session 56 of user zuul.
Dec 06 10:18:53 compute-1 systemd[1]: Started Session 56 of User zuul.
Dec 06 10:18:53 compute-1 sshd-session[245645]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 06 10:18:53 compute-1 sudo[245649]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/cat /var/tmp/sos-osp/sosreport-compute-1-2025-12-06-uimfxsg.tar.xz
Dec 06 10:18:53 compute-1 sudo[245649]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 10:18:53 compute-1 nova_compute[228576]: 2025-12-06 10:18:53.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:18:53 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:18:53 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:18:53 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:53.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:18:53 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:18:53 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:18:53 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:53.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:18:53 compute-1 sudo[245649]: pam_unix(sudo:session): session closed for user root
Dec 06 10:18:53 compute-1 sshd-session[245648]: Received disconnect from 192.168.122.10 port 41534:11: disconnected by user
Dec 06 10:18:53 compute-1 sshd-session[245648]: Disconnected from user zuul 192.168.122.10 port 41534
Dec 06 10:18:53 compute-1 sshd-session[245645]: pam_unix(sshd:session): session closed for user zuul
Dec 06 10:18:53 compute-1 systemd[1]: session-56.scope: Deactivated successfully.
Dec 06 10:18:53 compute-1 systemd-logind[788]: Session 56 logged out. Waiting for processes to exit.
Dec 06 10:18:53 compute-1 systemd-logind[788]: Removed session 56.
Dec 06 10:18:53 compute-1 sshd-session[245674]: Accepted publickey for zuul from 192.168.122.10 port 50274 ssh2: ECDSA SHA256:r1j7aLsKAM+XxDNbzEU5vWGpGNCOaIBwc7FZdATPttA
Dec 06 10:18:53 compute-1 systemd-logind[788]: New session 57 of user zuul.
Dec 06 10:18:53 compute-1 systemd[1]: Started Session 57 of User zuul.
Dec 06 10:18:53 compute-1 sshd-session[245674]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 06 10:18:53 compute-1 sudo[245679]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rm -rf /var/tmp/sos-osp
Dec 06 10:18:53 compute-1 sudo[245679]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 10:18:53 compute-1 sudo[245679]: pam_unix(sudo:session): session closed for user root
Dec 06 10:18:53 compute-1 sshd-session[245678]: Received disconnect from 192.168.122.10 port 50274:11: disconnected by user
Dec 06 10:18:53 compute-1 sshd-session[245678]: Disconnected from user zuul 192.168.122.10 port 50274
Dec 06 10:18:53 compute-1 sshd-session[245674]: pam_unix(sshd:session): session closed for user zuul
Dec 06 10:18:53 compute-1 systemd[1]: session-57.scope: Deactivated successfully.
Dec 06 10:18:53 compute-1 systemd-logind[788]: Session 57 logged out. Waiting for processes to exit.
Dec 06 10:18:53 compute-1 systemd-logind[788]: Removed session 57.
Dec 06 10:18:53 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:18:54 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:18:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:18:54.294 141446 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:18:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:18:54.296 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:18:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:18:54.296 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:18:55 compute-1 ceph-mon[79770]: pgmap v1135: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:18:55 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:18:55 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:18:55 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:55.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:18:55 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:18:55 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:18:55 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:55.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:18:55 compute-1 sudo[245706]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:18:55 compute-1 sudo[245706]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:18:55 compute-1 sudo[245706]: pam_unix(sudo:session): session closed for user root
Dec 06 10:18:56 compute-1 ceph-mon[79770]: pgmap v1136: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:18:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:56 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:18:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:56 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:18:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:56 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:18:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:18:57 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:18:57 compute-1 nova_compute[228576]: 2025-12-06 10:18:57.473 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:18:57 compute-1 nova_compute[228576]: 2025-12-06 10:18:57.473 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:18:57 compute-1 nova_compute[228576]: 2025-12-06 10:18:57.473 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:18:57 compute-1 nova_compute[228576]: 2025-12-06 10:18:57.495 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 06 10:18:57 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:18:57 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:18:57 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:57.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:18:57 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:18:57 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:18:57 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:57.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:18:58 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/3649759905' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:18:58 compute-1 podman[245733]: 2025-12-06 10:18:58.783176539 +0000 UTC m=+0.088540823 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 06 10:18:58 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:18:59 compute-1 ceph-mon[79770]: pgmap v1137: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:18:59 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/3542070725' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:18:59 compute-1 nova_compute[228576]: 2025-12-06 10:18:59.485 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:18:59 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:18:59 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:18:59 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:18:59.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:18:59 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:18:59 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:18:59 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:18:59.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:19:01 compute-1 ceph-mon[79770]: pgmap v1138: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:19:01 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:19:01 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:19:01 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:01.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:19:01 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:19:01 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:19:01 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:01.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:19:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:01 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:19:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:01 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:19:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:01 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:19:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:02 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:19:02 compute-1 podman[245763]: 2025-12-06 10:19:02.755072451 +0000 UTC m=+0.060278313 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 06 10:19:03 compute-1 ceph-mon[79770]: pgmap v1139: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:19:03 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:19:03 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:19:03 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:03.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:19:03 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:19:03 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:19:03 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:03.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:19:03 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:19:04 compute-1 ceph-mon[79770]: pgmap v1140: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:19:05 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:19:05 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:19:05 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:05.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:19:05 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:19:05 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:19:05 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:05.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:19:06 compute-1 podman[245786]: 2025-12-06 10:19:06.757042309 +0000 UTC m=+0.063378360 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 06 10:19:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:06 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:19:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:07 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:19:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:07 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:19:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:07 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:19:07 compute-1 ceph-mon[79770]: pgmap v1141: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:19:07 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:19:07 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:19:07 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:07.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:19:07 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:19:07 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:19:07 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:07.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:19:08 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:19:09 compute-1 ceph-mon[79770]: pgmap v1142: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:19:09 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:19:09 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:19:09 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:19:09 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:09.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:19:09 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:19:09 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:19:09 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:09.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:19:11 compute-1 ceph-mon[79770]: pgmap v1143: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:19:11 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:19:11 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:19:11 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:11.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:19:11 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:19:11 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:19:11 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:11.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:19:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:11 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:19:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:11 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:19:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:11 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:19:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:12 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:19:13 compute-1 ceph-mon[79770]: pgmap v1144: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:19:13 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:19:13 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:19:13 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:19:13 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:19:13 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:13.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:19:13 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:13.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:19:13 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:19:14 compute-1 ceph-mon[79770]: pgmap v1145: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:19:15 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:19:15 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:19:15 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:19:15 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:15.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:19:15 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:19:15 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:15.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:19:16 compute-1 sudo[245810]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:19:16 compute-1 sudo[245810]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:19:16 compute-1 sudo[245810]: pam_unix(sudo:session): session closed for user root
Dec 06 10:19:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:16 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:19:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:16 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:19:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:16 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:19:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:17 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:19:17 compute-1 ceph-mon[79770]: pgmap v1146: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:19:17 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:19:17 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:19:17 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:17.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:19:17 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:19:17 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:19:17 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:17.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:19:18 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:19:19 compute-1 ceph-mon[79770]: pgmap v1147: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:19:19 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:19:19 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:19:19 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:19.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:19:19 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:19:19 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:19:19 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:19.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:19:21 compute-1 ceph-mon[79770]: pgmap v1148: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:19:21 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:19:21 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:19:21 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:21.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:19:21 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:19:21 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:19:21 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:21.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:19:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:21 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:19:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:21 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:19:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:21 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:19:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:22 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:19:23 compute-1 ceph-mon[79770]: pgmap v1149: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:19:23 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:19:23 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:19:23 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:23.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:19:23 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:19:23 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:19:23 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:23.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:19:23 compute-1 sudo[245839]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:19:23 compute-1 sudo[245839]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:19:23 compute-1 sudo[245839]: pam_unix(sudo:session): session closed for user root
Dec 06 10:19:23 compute-1 sudo[245864]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 check-host
Dec 06 10:19:23 compute-1 sudo[245864]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:19:23 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:19:24 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:19:24 compute-1 sudo[245864]: pam_unix(sudo:session): session closed for user root
Dec 06 10:19:24 compute-1 sudo[245910]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:19:24 compute-1 sudo[245910]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:19:24 compute-1 sudo[245910]: pam_unix(sudo:session): session closed for user root
Dec 06 10:19:24 compute-1 sudo[245935]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 06 10:19:24 compute-1 sudo[245935]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:19:25 compute-1 sudo[245935]: pam_unix(sudo:session): session closed for user root
Dec 06 10:19:25 compute-1 ceph-mon[79770]: pgmap v1150: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:19:25 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:19:25 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:19:25 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:19:25 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:19:25 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:19:25 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:19:25 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:19:25 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:25.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:19:25 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:19:25 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:25.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:19:26 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 10:19:26 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 06 10:19:26 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:19:26 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:19:26 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 06 10:19:26 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 06 10:19:26 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 10:19:26 compute-1 ceph-mon[79770]: pgmap v1151: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:19:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:26 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:19:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:26 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:19:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:26 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:19:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:27 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:19:27 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:19:27 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:19:27 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:19:27 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:27.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:19:27 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:19:27 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:27.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:19:28 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:19:29 compute-1 ceph-mon[79770]: pgmap v1152: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:19:29 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:19:29 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:19:29 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:19:29 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:29.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:19:29 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:19:29 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:29.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:19:29 compute-1 podman[245993]: 2025-12-06 10:19:29.811866444 +0000 UTC m=+0.114789993 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 06 10:19:30 compute-1 sudo[246020]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:19:30 compute-1 sudo[246020]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:19:30 compute-1 sudo[246020]: pam_unix(sudo:session): session closed for user root
Dec 06 10:19:31 compute-1 ceph-mon[79770]: pgmap v1153: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:19:31 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:19:31 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:19:31 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:19:31 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:19:31 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:19:31 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:31.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:19:31 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:19:31 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:31.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:19:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:31 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:19:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:31 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:19:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:31 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:19:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:32 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:19:33 compute-1 ceph-mon[79770]: pgmap v1154: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:19:33 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:19:33 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:19:33 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:19:33 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:19:33 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:33.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:19:33 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:33.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:19:33 compute-1 podman[246046]: 2025-12-06 10:19:33.764473258 +0000 UTC m=+0.064155609 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 06 10:19:33 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:19:35 compute-1 ceph-mon[79770]: pgmap v1155: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:19:35 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:19:35 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:19:35 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:35.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:19:35 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:19:35 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:19:35 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:35.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:19:35 compute-1 ceph-osd[77465]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 10:19:35 compute-1 ceph-osd[77465]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 2400.1 total, 600.0 interval
                                           Cumulative writes: 10K writes, 41K keys, 10K commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.01 MB/s
                                           Cumulative WAL: 10K writes, 2997 syncs, 3.65 writes per sync, written: 0.03 GB, 0.01 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 1703 writes, 5619 keys, 1703 commit groups, 1.0 writes per commit group, ingest: 5.35 MB, 0.01 MB/s
                                           Interval WAL: 1703 writes, 744 syncs, 2.29 writes per sync, written: 0.01 GB, 0.01 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 06 10:19:36 compute-1 sudo[246066]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:19:36 compute-1 sudo[246066]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:19:36 compute-1 sudo[246066]: pam_unix(sudo:session): session closed for user root
Dec 06 10:19:36 compute-1 ceph-mon[79770]: pgmap v1156: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:19:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:36 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:19:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:36 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:19:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:36 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:19:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:37 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:19:37 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:19:37 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:19:37 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:19:37 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:37.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:19:37 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:19:37 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:37.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:19:37 compute-1 podman[246092]: 2025-12-06 10:19:37.745398884 +0000 UTC m=+0.055924135 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251125, tcib_managed=true)
Dec 06 10:19:38 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:19:39 compute-1 ceph-mon[79770]: pgmap v1157: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:19:39 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:19:39 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:19:39 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:19:39 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:19:39 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:39.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:19:39 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:19:39 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:39.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:19:41 compute-1 ceph-mon[79770]: pgmap v1158: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:19:41 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:19:41 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:19:41 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:41.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:19:41 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:19:41 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:19:41 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:41.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:19:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:41 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:19:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:41 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:19:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:41 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:19:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:42 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:19:43 compute-1 ceph-mon[79770]: pgmap v1159: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:19:43 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:19:43 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:19:43 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:43.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:19:43 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:19:43 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:19:43 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:43.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:19:43 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:19:44 compute-1 ceph-mon[79770]: pgmap v1160: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:19:45 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:19:45 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:19:45 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:45.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:19:45 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:19:45 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:19:45 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:45.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:19:46 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 06 10:19:46 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1961422206' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 06 10:19:46 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 06 10:19:46 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1961422206' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 06 10:19:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:47 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:19:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:47 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:19:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:47 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:19:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:47 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:19:47 compute-1 ceph-mon[79770]: pgmap v1161: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:19:47 compute-1 ceph-mon[79770]: from='client.? 192.168.122.10:0/1961422206' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 06 10:19:47 compute-1 ceph-mon[79770]: from='client.? 192.168.122.10:0/1961422206' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 06 10:19:47 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:19:47 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:19:47 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:47.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:19:47 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:19:47 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:19:47 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:47.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:19:48 compute-1 nova_compute[228576]: 2025-12-06 10:19:48.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:19:48 compute-1 nova_compute[228576]: 2025-12-06 10:19:48.471 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:19:48 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:19:49 compute-1 ceph-mon[79770]: pgmap v1162: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:19:49 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:19:49 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:19:49 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:49.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:19:49 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:19:49 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:19:49 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:49.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:19:50 compute-1 nova_compute[228576]: 2025-12-06 10:19:50.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:19:50 compute-1 nova_compute[228576]: 2025-12-06 10:19:50.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:19:51 compute-1 ceph-mon[79770]: pgmap v1163: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:19:51 compute-1 nova_compute[228576]: 2025-12-06 10:19:51.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:19:51 compute-1 nova_compute[228576]: 2025-12-06 10:19:51.504 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:19:51 compute-1 nova_compute[228576]: 2025-12-06 10:19:51.504 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:19:51 compute-1 nova_compute[228576]: 2025-12-06 10:19:51.505 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:19:51 compute-1 nova_compute[228576]: 2025-12-06 10:19:51.505 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:19:51 compute-1 nova_compute[228576]: 2025-12-06 10:19:51.505 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:19:51 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:19:51 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:19:51 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:51.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:19:51 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:19:51 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:19:51 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:51.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:19:51 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:19:51 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3584782970' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:19:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:51 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:19:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:51 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:19:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:51 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:19:52 compute-1 nova_compute[228576]: 2025-12-06 10:19:51.999 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.494s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:19:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:52 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:19:52 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/2562345917' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:19:52 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/3584782970' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:19:52 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/1289647203' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:19:52 compute-1 nova_compute[228576]: 2025-12-06 10:19:52.147 228580 WARNING nova.virt.libvirt.driver [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:19:52 compute-1 nova_compute[228576]: 2025-12-06 10:19:52.149 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5172MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:19:52 compute-1 nova_compute[228576]: 2025-12-06 10:19:52.149 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:19:52 compute-1 nova_compute[228576]: 2025-12-06 10:19:52.149 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:19:52 compute-1 nova_compute[228576]: 2025-12-06 10:19:52.249 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:19:52 compute-1 nova_compute[228576]: 2025-12-06 10:19:52.250 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:19:52 compute-1 nova_compute[228576]: 2025-12-06 10:19:52.276 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:19:52 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:19:52 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3001520887' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:19:52 compute-1 nova_compute[228576]: 2025-12-06 10:19:52.736 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:19:52 compute-1 nova_compute[228576]: 2025-12-06 10:19:52.743 228580 DEBUG nova.compute.provider_tree [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed in ProviderTree for provider: ff2f17cb-ff1d-4da7-9560-4be741380cb1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:19:52 compute-1 nova_compute[228576]: 2025-12-06 10:19:52.766 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed for provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:19:52 compute-1 nova_compute[228576]: 2025-12-06 10:19:52.768 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:19:52 compute-1 nova_compute[228576]: 2025-12-06 10:19:52.768 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.619s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:19:53 compute-1 ceph-mon[79770]: pgmap v1164: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:19:53 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/3001520887' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:19:53 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:19:53 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:19:53 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:53.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:19:53 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:19:53 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:19:53 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:53.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:19:53 compute-1 nova_compute[228576]: 2025-12-06 10:19:53.769 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:19:53 compute-1 nova_compute[228576]: 2025-12-06 10:19:53.770 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:19:53 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:19:54 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:19:54 compute-1 ceph-mon[79770]: pgmap v1165: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:19:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:19:54.295 141446 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:19:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:19:54.296 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:19:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:19:54.296 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:19:54 compute-1 nova_compute[228576]: 2025-12-06 10:19:54.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:19:55 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:19:55 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:19:55 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:55.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:19:55 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:19:55 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:19:55 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:55.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:19:56 compute-1 sudo[246166]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:19:56 compute-1 sudo[246166]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:19:56 compute-1 sudo[246166]: pam_unix(sudo:session): session closed for user root
Dec 06 10:19:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:56 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:19:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:56 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:19:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:56 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:19:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:19:57 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:19:57 compute-1 ceph-mon[79770]: pgmap v1166: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:19:57 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:19:57 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:19:57 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:19:57 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:57.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:19:57 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:19:57 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:57.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:19:58 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/502386387' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:19:58 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:19:59 compute-1 ceph-mon[79770]: pgmap v1167: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:19:59 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/4279224093' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:19:59 compute-1 nova_compute[228576]: 2025-12-06 10:19:59.463 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:19:59 compute-1 nova_compute[228576]: 2025-12-06 10:19:59.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:19:59 compute-1 nova_compute[228576]: 2025-12-06 10:19:59.470 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:19:59 compute-1 nova_compute[228576]: 2025-12-06 10:19:59.470 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:19:59 compute-1 nova_compute[228576]: 2025-12-06 10:19:59.492 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 06 10:19:59 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:19:59 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:19:59 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:19:59.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:19:59 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:19:59 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:19:59 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:19:59.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:20:00 compute-1 ceph-mon[79770]: overall HEALTH_OK
Dec 06 10:20:00 compute-1 podman[246194]: 2025-12-06 10:20:00.805314405 +0000 UTC m=+0.076977387 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 06 10:20:01 compute-1 ceph-mon[79770]: pgmap v1168: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:20:01 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:20:01 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:20:01 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:01.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:20:01 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:20:01 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:20:01 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:01.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:20:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:01 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:20:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:01 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:20:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:01 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:20:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:02 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:20:02 compute-1 ceph-mon[79770]: pgmap v1169: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:20:03 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:20:03 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:20:03 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:20:03 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:03.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:20:03 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:20:03 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:03.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:20:03 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:20:04 compute-1 podman[246222]: 2025-12-06 10:20:04.750156608 +0000 UTC m=+0.055863715 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 06 10:20:05 compute-1 ceph-mon[79770]: pgmap v1170: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:20:05 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:20:05 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:20:05 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:20:05 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:20:05 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:05.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:20:05 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:05.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:20:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:06 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:20:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:06 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:20:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:06 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:20:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:07 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:20:07 compute-1 ceph-mon[79770]: pgmap v1171: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:20:07 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:20:07 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:20:07 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:20:07 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:20:07 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:07.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:20:07 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:07.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:20:08 compute-1 podman[246244]: 2025-12-06 10:20:08.75640063 +0000 UTC m=+0.066412615 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec 06 10:20:08 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:20:09 compute-1 ceph-mon[79770]: pgmap v1172: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:20:09 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:20:09 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:20:09 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:20:09 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:20:09 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:09.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:20:09 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:20:09 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:09.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:20:10 compute-1 ceph-mon[79770]: pgmap v1173: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:20:11 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:20:11 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:20:11 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:20:11 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:20:11 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:11.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:20:11 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:11.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:20:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:11 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:20:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:11 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:20:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:11 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:20:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:12 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:20:12 compute-1 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 10:20:12 compute-1 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 2400.0 total, 600.0 interval
                                           Cumulative writes: 6652 writes, 35K keys, 6652 commit groups, 1.0 writes per commit group, ingest: 0.09 GB, 0.04 MB/s
                                           Cumulative WAL: 6652 writes, 6652 syncs, 1.00 writes per sync, written: 0.09 GB, 0.04 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 1546 writes, 8111 keys, 1546 commit groups, 1.0 writes per commit group, ingest: 18.02 MB, 0.03 MB/s
                                           Interval WAL: 1546 writes, 1546 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0    121.4      0.41              0.18        18    0.023       0      0       0.0       0.0
                                             L6      1/0   14.44 MB   0.0      0.3     0.0      0.2       0.2      0.0       0.0   4.5    137.6    119.0      1.92              0.69        17    0.113     94K   9317       0.0       0.0
                                            Sum      1/0   14.44 MB   0.0      0.3     0.0      0.2       0.3      0.1       0.0   5.5    113.2    119.4      2.33              0.87        35    0.067     94K   9317       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   5.9    165.6    169.8      0.41              0.18         8    0.052     26K   2592       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Low      0/0    0.00 KB   0.0      0.3     0.0      0.2       0.2      0.0       0.0   0.0    137.6    119.0      1.92              0.69        17    0.113     94K   9317       0.0       0.0
                                           High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0    122.0      0.41              0.18        17    0.024       0      0       0.0       0.0
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 2400.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.049, interval 0.012
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.27 GB write, 0.12 MB/s write, 0.26 GB read, 0.11 MB/s read, 2.3 seconds
                                           Interval compaction: 0.07 GB write, 0.12 MB/s write, 0.07 GB read, 0.11 MB/s read, 0.4 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fbbecff350#2 capacity: 304.00 MB usage: 22.89 MB table_size: 0 occupancy: 18446744073709551615 collections: 5 last_copies: 0 last_secs: 0.000468 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1382,22.14 MB,7.28405%) FilterBlock(35,279.92 KB,0.0899214%) IndexBlock(35,484.39 KB,0.155605%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
Dec 06 10:20:13 compute-1 ceph-mon[79770]: pgmap v1174: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:20:13 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:20:13 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:20:13 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:20:13 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:13.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:20:13 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:20:13 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:13.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:20:13 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:20:15 compute-1 ceph-mon[79770]: pgmap v1175: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:20:15 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:20:15 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:20:15 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:15.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:20:15 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:20:15 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:20:15 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:15.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:20:16 compute-1 sudo[246267]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:20:16 compute-1 sudo[246267]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:20:16 compute-1 sudo[246267]: pam_unix(sudo:session): session closed for user root
Dec 06 10:20:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:16 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:20:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:16 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:20:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:16 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:20:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:17 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:20:17 compute-1 ceph-mon[79770]: pgmap v1176: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:20:17 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:20:17 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:20:17 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:17.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:20:17 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:20:17 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:20:17 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:17.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:20:18 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:20:19 compute-1 ceph-mon[79770]: pgmap v1177: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:20:19 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:20:19 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:20:19 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:20:19 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:19.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:20:19 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:20:19 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:19.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:20:20 compute-1 ceph-mon[79770]: pgmap v1178: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:20:21 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:20:21 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:20:21 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:21.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:20:21 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:20:21 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:20:21 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:21.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:20:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:21 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:20:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:21 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:20:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:21 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:20:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:22 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:20:23 compute-1 ceph-mon[79770]: pgmap v1179: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:20:23 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:20:23 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:20:23 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:20:23 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:23.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:20:23 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:20:23 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:23.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:20:23 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:20:24 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:20:25 compute-1 ceph-mon[79770]: pgmap v1180: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:20:25 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:20:25 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:20:25 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:20:25 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:20:25 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:25.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:20:25 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:25.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:20:26 compute-1 ceph-mon[79770]: pgmap v1181: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:20:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:27 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:20:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:27 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:20:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:27 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:20:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:27 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:20:27 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:20:27 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:20:27 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:27.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:20:27 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:20:27 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:20:27 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:27.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:20:28 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:20:29 compute-1 ceph-mon[79770]: pgmap v1182: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:20:29 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:20:29 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:20:29 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:29.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:20:29 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:20:29 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:20:29 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:29.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:20:30 compute-1 sudo[246300]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:20:30 compute-1 sudo[246300]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:20:30 compute-1 sudo[246300]: pam_unix(sudo:session): session closed for user root
Dec 06 10:20:30 compute-1 sudo[246325]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 06 10:20:30 compute-1 sudo[246325]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:20:31 compute-1 ceph-mon[79770]: pgmap v1183: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:20:31 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:20:31 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:20:31 compute-1 sudo[246325]: pam_unix(sudo:session): session closed for user root
Dec 06 10:20:31 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:20:31 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:20:31 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:31.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:20:31 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:20:31 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:20:31 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:31.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:20:31 compute-1 podman[246381]: 2025-12-06 10:20:31.802025907 +0000 UTC m=+0.097355791 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 06 10:20:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:32 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:20:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:32 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:20:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:32 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:20:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:32 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:20:32 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 10:20:32 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 06 10:20:32 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:20:32 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:20:32 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 06 10:20:32 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 06 10:20:32 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 10:20:33 compute-1 ceph-mon[79770]: pgmap v1184: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:20:33 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:20:33 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:20:33 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:33.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:20:33 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:20:33 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:20:33 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:33.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:20:33 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:20:35 compute-1 ceph-mon[79770]: pgmap v1185: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:20:35 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:20:35 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:20:35 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:35.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:20:35 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:20:35 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:20:35 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:35.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:20:35 compute-1 podman[246410]: 2025-12-06 10:20:35.77193375 +0000 UTC m=+0.070425085 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 06 10:20:36 compute-1 ceph-mon[79770]: pgmap v1186: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:20:36 compute-1 sudo[246431]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:20:36 compute-1 sudo[246431]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:20:36 compute-1 sudo[246431]: pam_unix(sudo:session): session closed for user root
Dec 06 10:20:36 compute-1 sudo[246456]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:20:36 compute-1 sudo[246456]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:20:36 compute-1 sudo[246456]: pam_unix(sudo:session): session closed for user root
Dec 06 10:20:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:36 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:20:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:36 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:20:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:36 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:20:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:37 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:20:37 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:20:37 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:20:37 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:20:37 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:20:37 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:37.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:20:37 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:20:37 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:20:37 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:37.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:20:38 compute-1 ceph-mon[79770]: pgmap v1187: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:20:38 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:20:39 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:20:39 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:20:39 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:39.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:20:39 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:20:39 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:20:39 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:39.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:20:39 compute-1 podman[246482]: 2025-12-06 10:20:39.777117837 +0000 UTC m=+0.088309148 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=multipathd)
Dec 06 10:20:39 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:20:40 compute-1 ceph-mon[79770]: pgmap v1188: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:20:41 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:20:41 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:20:41 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:41.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:20:41 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:20:41 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:20:41 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:41.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:20:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:41 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:20:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:41 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:20:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:41 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:20:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:42 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:20:43 compute-1 ceph-mon[79770]: pgmap v1189: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:20:43 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:20:43 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:20:43 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:20:43 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:43.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:20:43 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:20:43 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:43.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:20:43 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:20:45 compute-1 ceph-mon[79770]: pgmap v1190: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:20:45 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:20:45 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:20:45 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:45.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:20:45 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:20:45 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:20:45 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:45.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:20:46 compute-1 ceph-mon[79770]: pgmap v1191: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:20:46 compute-1 ceph-mon[79770]: from='client.? 192.168.122.10:0/4190015806' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 06 10:20:46 compute-1 ceph-mon[79770]: from='client.? 192.168.122.10:0/4190015806' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 06 10:20:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:46 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:20:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:46 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:20:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:46 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:20:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:47 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:20:47 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:20:47 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:20:47 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:20:47 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:20:47 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:47.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:20:47 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:47.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:20:48 compute-1 nova_compute[228576]: 2025-12-06 10:20:48.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:20:48 compute-1 nova_compute[228576]: 2025-12-06 10:20:48.471 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Dec 06 10:20:48 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:20:49 compute-1 ceph-mon[79770]: pgmap v1192: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:20:49 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:20:49 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:20:49 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:49.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:20:49 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:20:49 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:20:49 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:49.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:20:50 compute-1 nova_compute[228576]: 2025-12-06 10:20:50.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:20:50 compute-1 nova_compute[228576]: 2025-12-06 10:20:50.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:20:50 compute-1 nova_compute[228576]: 2025-12-06 10:20:50.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:20:50 compute-1 nova_compute[228576]: 2025-12-06 10:20:50.471 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:20:50 compute-1 nova_compute[228576]: 2025-12-06 10:20:50.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:20:51 compute-1 ceph-mon[79770]: pgmap v1193: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:20:51 compute-1 nova_compute[228576]: 2025-12-06 10:20:51.481 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:20:51 compute-1 nova_compute[228576]: 2025-12-06 10:20:51.501 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:20:51 compute-1 nova_compute[228576]: 2025-12-06 10:20:51.501 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:20:51 compute-1 nova_compute[228576]: 2025-12-06 10:20:51.502 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:20:51 compute-1 nova_compute[228576]: 2025-12-06 10:20:51.502 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:20:51 compute-1 nova_compute[228576]: 2025-12-06 10:20:51.502 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:20:51 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:20:51 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:20:51 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:51.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:20:51 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:20:51 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:20:51 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:51.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:20:51 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:20:51 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3996973664' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:20:51 compute-1 nova_compute[228576]: 2025-12-06 10:20:51.953 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:20:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:51 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:20:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:51 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:20:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:51 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:20:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:52 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:20:52 compute-1 nova_compute[228576]: 2025-12-06 10:20:52.108 228580 WARNING nova.virt.libvirt.driver [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:20:52 compute-1 nova_compute[228576]: 2025-12-06 10:20:52.109 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5189MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:20:52 compute-1 nova_compute[228576]: 2025-12-06 10:20:52.110 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:20:52 compute-1 nova_compute[228576]: 2025-12-06 10:20:52.110 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:20:52 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/1912669683' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:20:52 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/3996973664' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:20:52 compute-1 nova_compute[228576]: 2025-12-06 10:20:52.404 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:20:52 compute-1 nova_compute[228576]: 2025-12-06 10:20:52.405 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:20:52 compute-1 nova_compute[228576]: 2025-12-06 10:20:52.463 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Refreshing inventories for resource provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec 06 10:20:52 compute-1 nova_compute[228576]: 2025-12-06 10:20:52.517 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Updating ProviderTree inventory for provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec 06 10:20:52 compute-1 nova_compute[228576]: 2025-12-06 10:20:52.518 228580 DEBUG nova.compute.provider_tree [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Updating inventory in ProviderTree for provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 06 10:20:52 compute-1 nova_compute[228576]: 2025-12-06 10:20:52.537 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Refreshing aggregate associations for resource provider ff2f17cb-ff1d-4da7-9560-4be741380cb1, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec 06 10:20:52 compute-1 nova_compute[228576]: 2025-12-06 10:20:52.556 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Refreshing trait associations for resource provider ff2f17cb-ff1d-4da7-9560-4be741380cb1, traits: COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_ACCELERATORS,HW_CPU_X86_SVM,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_AESNI,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSSE3,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE4A,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_FMA3,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_AVX,HW_CPU_X86_BMI2,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SHA,HW_CPU_X86_SSE2,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_F16C,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_ABM,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NODE,COMPUTE_SECURITY_TPM_2_0,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_TPM_1_2,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_CLMUL,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AVX2,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_BMI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SOCKET_PCI_NUMA_AFFINITY _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec 06 10:20:52 compute-1 nova_compute[228576]: 2025-12-06 10:20:52.573 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:20:52 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:20:52 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/225600293' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:20:53 compute-1 nova_compute[228576]: 2025-12-06 10:20:53.001 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.428s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:20:53 compute-1 nova_compute[228576]: 2025-12-06 10:20:53.007 228580 DEBUG nova.compute.provider_tree [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed in ProviderTree for provider: ff2f17cb-ff1d-4da7-9560-4be741380cb1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:20:53 compute-1 nova_compute[228576]: 2025-12-06 10:20:53.163 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed for provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:20:53 compute-1 nova_compute[228576]: 2025-12-06 10:20:53.165 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:20:53 compute-1 nova_compute[228576]: 2025-12-06 10:20:53.165 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.055s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:20:53 compute-1 ceph-mon[79770]: pgmap v1194: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:20:53 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/2232808873' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:20:53 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/225600293' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:20:53 compute-1 nova_compute[228576]: 2025-12-06 10:20:53.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:20:53 compute-1 nova_compute[228576]: 2025-12-06 10:20:53.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:20:53 compute-1 nova_compute[228576]: 2025-12-06 10:20:53.472 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Dec 06 10:20:53 compute-1 nova_compute[228576]: 2025-12-06 10:20:53.493 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Dec 06 10:20:53 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:20:53 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:20:53 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:20:53 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:20:53 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:53.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:20:53 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:53.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:20:53 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:20:54 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:20:54 compute-1 ceph-mon[79770]: pgmap v1195: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:20:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:20:54.296 141446 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:20:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:20:54.296 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:20:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:20:54.297 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:20:54 compute-1 nova_compute[228576]: 2025-12-06 10:20:54.492 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:20:54 compute-1 nova_compute[228576]: 2025-12-06 10:20:54.492 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:20:55 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:20:55 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:20:55 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:20:55 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:20:55 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:55.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:20:55 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:55.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:20:56 compute-1 nova_compute[228576]: 2025-12-06 10:20:56.464 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:20:56 compute-1 sudo[246555]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:20:56 compute-1 sudo[246555]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:20:56 compute-1 sudo[246555]: pam_unix(sudo:session): session closed for user root
Dec 06 10:20:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:56 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:20:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:56 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:20:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:56 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:20:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:20:57 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:20:57 compute-1 ceph-mon[79770]: pgmap v1196: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:20:57 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:20:57 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:20:57 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:20:57 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:57.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:20:57 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:20:57 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:57.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:20:58 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:20:59 compute-1 ceph-mon[79770]: pgmap v1197: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:20:59 compute-1 nova_compute[228576]: 2025-12-06 10:20:59.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:20:59 compute-1 nova_compute[228576]: 2025-12-06 10:20:59.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:20:59 compute-1 nova_compute[228576]: 2025-12-06 10:20:59.471 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:20:59 compute-1 nova_compute[228576]: 2025-12-06 10:20:59.471 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:20:59 compute-1 nova_compute[228576]: 2025-12-06 10:20:59.501 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 06 10:20:59 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:20:59 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:20:59 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:20:59 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:20:59.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:20:59 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:20:59 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:20:59.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:21:00 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/4133591868' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:21:00 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/2168834215' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:21:01 compute-1 ceph-mon[79770]: pgmap v1198: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:21:01 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:21:01 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:21:01 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:01.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:21:01 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:21:01 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:21:01 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:01.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:21:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:01 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:21:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:01 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:21:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:01 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:21:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:02 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:21:02 compute-1 ceph-mon[79770]: pgmap v1199: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:21:02 compute-1 podman[246583]: 2025-12-06 10:21:02.79504105 +0000 UTC m=+0.101466224 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:21:03 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:21:03 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:21:03 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:03.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:21:03 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:21:03 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:21:03 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:03.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:21:03 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:21:05 compute-1 ceph-mon[79770]: pgmap v1200: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:21:05 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:21:05 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:21:05 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:21:05 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:05.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:21:05 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:21:05 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:05.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:21:06 compute-1 podman[246611]: 2025-12-06 10:21:06.766274755 +0000 UTC m=+0.076432443 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Dec 06 10:21:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:07 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:21:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:07 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:21:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:07 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:21:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:07 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:21:07 compute-1 ceph-mon[79770]: pgmap v1201: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:21:07 compute-1 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #64. Immutable memtables: 0.
Dec 06 10:21:07 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:21:07.130300) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 06 10:21:07 compute-1 ceph-mon[79770]: rocksdb: [db/flush_job.cc:856] [default] [JOB 37] Flushing memtable with next log file: 64
Dec 06 10:21:07 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016467130510, "job": 37, "event": "flush_started", "num_memtables": 1, "num_entries": 2726, "num_deletes": 508, "total_data_size": 6302447, "memory_usage": 6389056, "flush_reason": "Manual Compaction"}
Dec 06 10:21:07 compute-1 ceph-mon[79770]: rocksdb: [db/flush_job.cc:885] [default] [JOB 37] Level-0 flush table #65: started
Dec 06 10:21:07 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016467152814, "cf_name": "default", "job": 37, "event": "table_file_creation", "file_number": 65, "file_size": 4104756, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 33801, "largest_seqno": 36522, "table_properties": {"data_size": 4093704, "index_size": 6586, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3397, "raw_key_size": 27198, "raw_average_key_size": 20, "raw_value_size": 4069181, "raw_average_value_size": 3023, "num_data_blocks": 283, "num_entries": 1346, "num_filter_entries": 1346, "num_deletions": 508, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765016261, "oldest_key_time": 1765016261, "file_creation_time": 1765016467, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 65, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:21:07 compute-1 ceph-mon[79770]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 37] Flush lasted 22554 microseconds, and 12195 cpu microseconds.
Dec 06 10:21:07 compute-1 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:21:07 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:21:07.152902) [db/flush_job.cc:967] [default] [JOB 37] Level-0 flush table #65: 4104756 bytes OK
Dec 06 10:21:07 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:21:07.152937) [db/memtable_list.cc:519] [default] Level-0 commit table #65 started
Dec 06 10:21:07 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:21:07.154361) [db/memtable_list.cc:722] [default] Level-0 commit table #65: memtable #1 done
Dec 06 10:21:07 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:21:07.154383) EVENT_LOG_v1 {"time_micros": 1765016467154378, "job": 37, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 06 10:21:07 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:21:07.154400) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 06 10:21:07 compute-1 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 37] Try to delete WAL files size 6289133, prev total WAL file size 6289133, number of live WAL files 2.
Dec 06 10:21:07 compute-1 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000061.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:21:07 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:21:07.156539) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032373631' seq:72057594037927935, type:22 .. '7061786F730033303133' seq:0, type:0; will stop at (end)
Dec 06 10:21:07 compute-1 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 38] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 06 10:21:07 compute-1 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 37 Base level 0, inputs: [65(4008KB)], [63(14MB)]
Dec 06 10:21:07 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016467156746, "job": 38, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [65], "files_L6": [63], "score": -1, "input_data_size": 19241345, "oldest_snapshot_seqno": -1}
Dec 06 10:21:07 compute-1 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 38] Generated table #66: 6841 keys, 17003965 bytes, temperature: kUnknown
Dec 06 10:21:07 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016467261596, "cf_name": "default", "job": 38, "event": "table_file_creation", "file_number": 66, "file_size": 17003965, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 16956293, "index_size": 29448, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 17157, "raw_key_size": 176695, "raw_average_key_size": 25, "raw_value_size": 16831415, "raw_average_value_size": 2460, "num_data_blocks": 1183, "num_entries": 6841, "num_filter_entries": 6841, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014012, "oldest_key_time": 0, "file_creation_time": 1765016467, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 66, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:21:07 compute-1 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:21:07 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:21:07.261932) [db/compaction/compaction_job.cc:1663] [default] [JOB 38] Compacted 1@0 + 1@6 files to L6 => 17003965 bytes
Dec 06 10:21:07 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:21:07.263497) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 183.4 rd, 162.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.9, 14.4 +0.0 blob) out(16.2 +0.0 blob), read-write-amplify(8.8) write-amplify(4.1) OK, records in: 7876, records dropped: 1035 output_compression: NoCompression
Dec 06 10:21:07 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:21:07.263518) EVENT_LOG_v1 {"time_micros": 1765016467263508, "job": 38, "event": "compaction_finished", "compaction_time_micros": 104932, "compaction_time_cpu_micros": 63659, "output_level": 6, "num_output_files": 1, "total_output_size": 17003965, "num_input_records": 7876, "num_output_records": 6841, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 06 10:21:07 compute-1 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000065.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:21:07 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016467264345, "job": 38, "event": "table_file_deletion", "file_number": 65}
Dec 06 10:21:07 compute-1 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000063.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:21:07 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016467267075, "job": 38, "event": "table_file_deletion", "file_number": 63}
Dec 06 10:21:07 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:21:07.156353) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:21:07 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:21:07.267219) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:21:07 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:21:07.267226) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:21:07 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:21:07.267228) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:21:07 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:21:07.267230) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:21:07 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:21:07.267232) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:21:07 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:21:07 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:21:07 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:21:07 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:07.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:21:07 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:21:07 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:07.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:21:08 compute-1 ceph-mon[79770]: pgmap v1202: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:21:08 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:21:09 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:21:09 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:21:09 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:21:09 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:09.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:21:09 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:21:09 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:21:09 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:09.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:21:10 compute-1 ceph-mon[79770]: pgmap v1203: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:21:10 compute-1 podman[246633]: 2025-12-06 10:21:10.761319292 +0000 UTC m=+0.065958964 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Dec 06 10:21:11 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:21:11 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:21:11 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:21:11 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:11.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:21:11 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:21:11 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:11.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:21:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:11 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:21:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:12 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:21:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:12 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:21:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:12 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:21:13 compute-1 ceph-mon[79770]: pgmap v1204: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:21:13 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:21:13 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:21:13 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:21:13 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:21:13 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:13.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:21:13 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:13.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:21:13 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:21:15 compute-1 ceph-mon[79770]: pgmap v1205: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:21:15 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:21:15 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:21:15 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:15.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:21:15 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:21:15 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:21:15 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:15.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:21:16 compute-1 sudo[246657]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:21:16 compute-1 sudo[246657]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:21:16 compute-1 sudo[246657]: pam_unix(sudo:session): session closed for user root
Dec 06 10:21:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:16 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:21:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:16 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:21:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:16 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:21:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:17 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:21:17 compute-1 ceph-mon[79770]: pgmap v1206: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:21:17 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:21:17 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:21:17 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:17.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:21:17 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:21:17 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:21:17 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:17.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:21:18 compute-1 ceph-mon[79770]: pgmap v1207: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:21:18 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:21:19 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:21:19 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:21:19 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:19.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:21:19 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:21:19 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:21:19 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:19.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:21:20 compute-1 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #67. Immutable memtables: 0.
Dec 06 10:21:20 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:21:20.029445) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 06 10:21:20 compute-1 ceph-mon[79770]: rocksdb: [db/flush_job.cc:856] [default] [JOB 39] Flushing memtable with next log file: 67
Dec 06 10:21:20 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016480029485, "job": 39, "event": "flush_started", "num_memtables": 1, "num_entries": 362, "num_deletes": 251, "total_data_size": 331044, "memory_usage": 338048, "flush_reason": "Manual Compaction"}
Dec 06 10:21:20 compute-1 ceph-mon[79770]: rocksdb: [db/flush_job.cc:885] [default] [JOB 39] Level-0 flush table #68: started
Dec 06 10:21:20 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016480033526, "cf_name": "default", "job": 39, "event": "table_file_creation", "file_number": 68, "file_size": 217819, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 36527, "largest_seqno": 36884, "table_properties": {"data_size": 215646, "index_size": 337, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 773, "raw_key_size": 5932, "raw_average_key_size": 20, "raw_value_size": 211357, "raw_average_value_size": 721, "num_data_blocks": 15, "num_entries": 293, "num_filter_entries": 293, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765016468, "oldest_key_time": 1765016468, "file_creation_time": 1765016480, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 68, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:21:20 compute-1 ceph-mon[79770]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 39] Flush lasted 4132 microseconds, and 1430 cpu microseconds.
Dec 06 10:21:20 compute-1 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:21:20 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:21:20.033581) [db/flush_job.cc:967] [default] [JOB 39] Level-0 flush table #68: 217819 bytes OK
Dec 06 10:21:20 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:21:20.033595) [db/memtable_list.cc:519] [default] Level-0 commit table #68 started
Dec 06 10:21:20 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:21:20.034617) [db/memtable_list.cc:722] [default] Level-0 commit table #68: memtable #1 done
Dec 06 10:21:20 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:21:20.034635) EVENT_LOG_v1 {"time_micros": 1765016480034630, "job": 39, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 06 10:21:20 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:21:20.034655) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 06 10:21:20 compute-1 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 39] Try to delete WAL files size 328607, prev total WAL file size 328607, number of live WAL files 2.
Dec 06 10:21:20 compute-1 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000064.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:21:20 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:21:20.035326) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740031303033' seq:72057594037927935, type:22 .. '6D6772737461740031323535' seq:0, type:0; will stop at (end)
Dec 06 10:21:20 compute-1 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 40] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 06 10:21:20 compute-1 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 39 Base level 0, inputs: [68(212KB)], [66(16MB)]
Dec 06 10:21:20 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016480035408, "job": 40, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [68], "files_L6": [66], "score": -1, "input_data_size": 17221784, "oldest_snapshot_seqno": -1}
Dec 06 10:21:20 compute-1 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 40] Generated table #69: 6624 keys, 13145244 bytes, temperature: kUnknown
Dec 06 10:21:20 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016480097435, "cf_name": "default", "job": 40, "event": "table_file_creation", "file_number": 69, "file_size": 13145244, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13103839, "index_size": 23757, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16581, "raw_key_size": 172367, "raw_average_key_size": 26, "raw_value_size": 12987337, "raw_average_value_size": 1960, "num_data_blocks": 945, "num_entries": 6624, "num_filter_entries": 6624, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014012, "oldest_key_time": 0, "file_creation_time": 1765016480, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 69, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:21:20 compute-1 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:21:20 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:21:20.097730) [db/compaction/compaction_job.cc:1663] [default] [JOB 40] Compacted 1@0 + 1@6 files to L6 => 13145244 bytes
Dec 06 10:21:20 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:21:20.099125) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 277.2 rd, 211.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.2, 16.2 +0.0 blob) out(12.5 +0.0 blob), read-write-amplify(139.4) write-amplify(60.3) OK, records in: 7134, records dropped: 510 output_compression: NoCompression
Dec 06 10:21:20 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:21:20.099169) EVENT_LOG_v1 {"time_micros": 1765016480099158, "job": 40, "event": "compaction_finished", "compaction_time_micros": 62119, "compaction_time_cpu_micros": 30156, "output_level": 6, "num_output_files": 1, "total_output_size": 13145244, "num_input_records": 7134, "num_output_records": 6624, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 06 10:21:20 compute-1 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000068.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:21:20 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016480099383, "job": 40, "event": "table_file_deletion", "file_number": 68}
Dec 06 10:21:20 compute-1 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000066.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:21:20 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016480103628, "job": 40, "event": "table_file_deletion", "file_number": 66}
Dec 06 10:21:20 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:21:20.035174) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:21:20 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:21:20.103785) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:21:20 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:21:20.103792) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:21:20 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:21:20.103794) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:21:20 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:21:20.103796) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:21:20 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:21:20.103797) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
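[annotation] Jobs 39 and 40 above are a manual flush-and-compact pass on the mon store: a 362-entry memtable is flushed to L0 table #68 (~213 KiB), which is then compacted with the 16 MiB L6 table into #69, dropping 510 records; the tiny L0 input against a full L6 rewrite explains the high read-write-amplify(139.4). The EVENT_LOG_v1 payloads are plain JSON, so they can be mined directly; a sketch assuming mon log text on stdin:

```python
# A minimal sketch (assumes mon log text on stdin). The EVENT_LOG_v1
# payloads are plain JSON; this prints one summary line per flush or
# compaction event.
import json
import re
import sys

EV = re.compile(r"EVENT_LOG_v1 (\{.*\})")

for line in sys.stdin:
    m = EV.search(line)
    if not m:
        continue
    ev = json.loads(m.group(1))
    if ev.get("event") == "compaction_finished":
        secs = ev["compaction_time_micros"] / 1e6
        mib = ev["total_output_size"] / 2**20
        print(f"job {ev['job']}: compacted to L{ev['output_level']}, "
              f"{mib:.1f} MiB in {secs:.3f}s, "
              f"{ev['num_input_records']} -> {ev['num_output_records']} records")
    elif ev.get("event") == "flush_finished":
        print(f"job {ev['job']}: flush done, lsm_state={ev['lsm_state']}")
```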
Dec 06 10:21:21 compute-1 ceph-mon[79770]: pgmap v1208: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:21:21 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:21:21 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:21:21 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:21.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:21:21 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:21:21 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:21:21 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:21.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:21:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:21 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:21:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:21 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:21:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:21 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:21:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:22 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:21:23 compute-1 ceph-mon[79770]: pgmap v1209: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:21:23 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:21:23 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:21:23 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:23.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:21:23 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:21:23 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:21:23 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:23.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:21:23 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:21:24 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
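[annotation] The mgr's "osd blocklist ls" dispatches recur on a ~15 s cadence here; with no other fencing activity in the log these look like routine mgr polling. The same query run by hand, as a sketch assuming admin credentials on this host:

```python
# A minimal sketch (assumes a client.admin keyring; the JSON shape of
# "osd blocklist ls" output may vary across Ceph releases, hence the
# defensive parse). Runs the same query the mgr dispatches above.
import json
import subprocess

out = subprocess.run(
    ["ceph", "osd", "blocklist", "ls", "--format", "json"],
    capture_output=True, text=True, check=True,
).stdout
entries = json.loads(out or "[]")
print(f"{len(entries)} blocklist entries")
```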
Dec 06 10:21:25 compute-1 ceph-mon[79770]: pgmap v1210: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:21:25 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:21:25 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:21:25 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:25.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:21:25 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:21:25 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:21:25 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:25.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:21:26 compute-1 ceph-mon[79770]: pgmap v1211: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:21:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:26 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:21:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:26 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:21:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:26 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:21:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:27 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:21:27 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:21:27 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:21:27 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:27.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:21:27 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:21:27 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:21:27 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:27.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:21:28 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:21:29 compute-1 ceph-mon[79770]: pgmap v1212: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:21:29 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:21:29 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:21:29 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:29.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:21:29 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:21:29 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:21:29 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:29.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:21:31 compute-1 ceph-mon[79770]: pgmap v1213: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:21:31 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:21:31 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:21:31 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:31.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:21:31 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:21:31 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:21:31 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:31.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:21:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:31 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:21:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:31 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:21:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:31 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:21:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:32 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:21:32 compute-1 ceph-mon[79770]: pgmap v1214: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:21:33 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:21:33 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:21:33 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:33.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:21:33 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:21:33 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:21:33 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:33.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:21:33 compute-1 podman[246690]: 2025-12-06 10:21:33.818174169 +0000 UTC m=+0.122261488 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
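[annotation] The container health_status events (ovn_controller here, ovn_metadata_agent and multipathd below) all report healthy with a zero failing streak; each corresponds to podman running the container's configured '/openstack/healthcheck' test. The same check can be invoked on demand; a sketch assuming podman manages these containers as the events show:

```python
# A minimal sketch (assumes podman drives these containers, as the events
# above show). Runs the configured healthcheck for one container on demand;
# exit status 0 means healthy.
import subprocess

r = subprocess.run(
    ["podman", "healthcheck", "run", "ovn_controller"],
    capture_output=True, text=True,
)
print("healthy" if r.returncode == 0 else f"unhealthy: {r.stdout or r.stderr}")
```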
Dec 06 10:21:34 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:21:34 compute-1 nova_compute[228576]: 2025-12-06 10:21:34.481 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:21:35 compute-1 ceph-mon[79770]: pgmap v1215: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:21:35 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:21:35 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:21:35 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:35.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:21:35 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:21:35 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:21:35 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:35.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:21:36 compute-1 sudo[246719]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:21:36 compute-1 sudo[246719]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:21:36 compute-1 sudo[246719]: pam_unix(sudo:session): session closed for user root
Dec 06 10:21:36 compute-1 sudo[246732]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:21:36 compute-1 sudo[246732]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:21:36 compute-1 sudo[246732]: pam_unix(sudo:session): session closed for user root
Dec 06 10:21:36 compute-1 sudo[246768]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 06 10:21:36 compute-1 sudo[246768]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
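[annotation] This sudo sequence is consistent with cephadm's per-host refresh: a /bin/true connectivity probe, a "which python3" interpreter probe, then the fsid-suffixed cephadm copy running gather-facts under an 895 s timeout. The same facts can be collected by hand; a sketch assuming a cephadm binary on PATH (field names follow cephadm's gather-facts JSON and may differ between releases):

```python
# A minimal sketch (run as root; assumes a cephadm binary on PATH -- the mgr
# uses the fsid-suffixed copy shown above). Field names follow cephadm's
# gather-facts JSON and may differ between releases, hence .get().
import json
import subprocess

facts = json.loads(
    subprocess.run(
        ["cephadm", "gather-facts"],
        capture_output=True, text=True, check=True,
    ).stdout
)
for key in ("hostname", "operating_system", "cpu_count", "memory_total_kb"):
    print(key, "=", facts.get(key))
```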
Dec 06 10:21:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:36 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:21:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:37 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:21:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:37 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:21:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:37 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:21:37 compute-1 ceph-mon[79770]: pgmap v1216: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:21:37 compute-1 sudo[246768]: pam_unix(sudo:session): session closed for user root
Dec 06 10:21:37 compute-1 podman[246826]: 2025-12-06 10:21:37.750307456 +0000 UTC m=+0.057714080 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Dec 06 10:21:37 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:21:37 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:21:37 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:37.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:21:37 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:21:37 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:21:37 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:37.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:21:38 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 10:21:38 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 06 10:21:38 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:21:38 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:21:38 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 06 10:21:38 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 06 10:21:38 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 10:21:39 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:21:39 compute-1 ceph-mon[79770]: pgmap v1217: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:21:39 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:21:39 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:21:39 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:21:39 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:39.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:21:39 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:21:39 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:21:39 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:39.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:21:40 compute-1 ceph-mon[79770]: pgmap v1218: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:21:41 compute-1 podman[246848]: 2025-12-06 10:21:41.755606075 +0000 UTC m=+0.065081272 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible)
Dec 06 10:21:41 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:21:41 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:21:41 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:41.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:21:41 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:21:41 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:21:41 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:41.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:21:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:41 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:21:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:41 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:21:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:41 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:21:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:42 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:21:42 compute-1 sudo[246870]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:21:42 compute-1 sudo[246870]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:21:42 compute-1 sudo[246870]: pam_unix(sudo:session): session closed for user root
Dec 06 10:21:43 compute-1 ceph-mon[79770]: pgmap v1219: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:21:43 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:21:43 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:21:43 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:21:43 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:21:43 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:43.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:21:43 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:21:43 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:21:43 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:43.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:21:44 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:21:45 compute-1 ceph-mon[79770]: pgmap v1220: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:21:45 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:21:45 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:21:45 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:45.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:21:45 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:21:45 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:21:45 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:45.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:21:46 compute-1 ceph-mon[79770]: from='client.? 192.168.122.10:0/2229648088' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 06 10:21:46 compute-1 ceph-mon[79770]: from='client.? 192.168.122.10:0/2229648088' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
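[annotation] The paired df and "osd pool get-quota volumes" queries from client.openstack are consistent with a Cinder capacity poll against the volumes pool. A sketch reproducing the quota half with the same client identity and conf path the log shows:

```python
# A minimal sketch reproducing the pool-quota query above, with the same
# client identity and conf path the log shows; key names follow
# "osd pool get-quota --format json" and are guarded with .get().
import json
import subprocess

out = subprocess.run(
    ["ceph", "osd", "pool", "get-quota", "volumes",
     "--id", "openstack", "--conf", "/etc/ceph/ceph.conf",
     "--format", "json"],
    capture_output=True, text=True, check=True,
).stdout
quota = json.loads(out)
print("max bytes:", quota.get("quota_max_bytes"),
      "max objects:", quota.get("quota_max_objects"))
```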
Dec 06 10:21:46 compute-1 ceph-mon[79770]: pgmap v1221: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:21:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:47 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:21:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:47 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:21:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:47 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:21:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:47 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:21:47 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:21:47 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:21:47 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:47.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:21:47 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:21:47 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:21:47 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:47.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:21:49 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:21:49 compute-1 ceph-mon[79770]: pgmap v1222: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:21:49 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:21:49 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:21:49 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:49.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:21:49 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:21:49 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:21:49 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:49.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:21:51 compute-1 ceph-mon[79770]: pgmap v1223: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:21:51 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:21:51 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:21:51 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:51.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:21:51 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:21:51 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:21:51 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:51.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:21:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:51 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:21:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:52 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:21:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:52 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:21:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:52 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:21:52 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/3012262960' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:21:52 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/514334911' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:21:52 compute-1 nova_compute[228576]: 2025-12-06 10:21:52.469 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:21:52 compute-1 nova_compute[228576]: 2025-12-06 10:21:52.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:21:52 compute-1 nova_compute[228576]: 2025-12-06 10:21:52.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:21:52 compute-1 nova_compute[228576]: 2025-12-06 10:21:52.470 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:21:52 compute-1 nova_compute[228576]: 2025-12-06 10:21:52.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:21:52 compute-1 nova_compute[228576]: 2025-12-06 10:21:52.496 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:21:52 compute-1 nova_compute[228576]: 2025-12-06 10:21:52.497 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:21:52 compute-1 nova_compute[228576]: 2025-12-06 10:21:52.497 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:21:52 compute-1 nova_compute[228576]: 2025-12-06 10:21:52.497 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:21:52 compute-1 nova_compute[228576]: 2025-12-06 10:21:52.497 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:21:52 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:21:52 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3576677501' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:21:52 compute-1 nova_compute[228576]: 2025-12-06 10:21:52.937 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.440s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:21:53 compute-1 nova_compute[228576]: 2025-12-06 10:21:53.098 228580 WARNING nova.virt.libvirt.driver [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:21:53 compute-1 nova_compute[228576]: 2025-12-06 10:21:53.100 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5173MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:21:53 compute-1 nova_compute[228576]: 2025-12-06 10:21:53.100 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:21:53 compute-1 nova_compute[228576]: 2025-12-06 10:21:53.101 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:21:53 compute-1 ceph-mon[79770]: pgmap v1224: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:21:53 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/3576677501' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:21:53 compute-1 nova_compute[228576]: 2025-12-06 10:21:53.188 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:21:53 compute-1 nova_compute[228576]: 2025-12-06 10:21:53.189 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:21:53 compute-1 nova_compute[228576]: 2025-12-06 10:21:53.219 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:21:53 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:21:53 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1265861855' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:21:53 compute-1 nova_compute[228576]: 2025-12-06 10:21:53.640 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.422s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:21:53 compute-1 nova_compute[228576]: 2025-12-06 10:21:53.645 228580 DEBUG nova.compute.provider_tree [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed in ProviderTree for provider: ff2f17cb-ff1d-4da7-9560-4be741380cb1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:21:53 compute-1 nova_compute[228576]: 2025-12-06 10:21:53.661 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed for provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:21:53 compute-1 nova_compute[228576]: 2025-12-06 10:21:53.663 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:21:53 compute-1 nova_compute[228576]: 2025-12-06 10:21:53.663 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.562s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
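The Acquiring/acquired/released triple completed above (waited 0.000s, held 0.562s) is how oslo.concurrency traces a named lock. A minimal sketch of the same pattern with lockutils.synchronized; the lock name comes from the log, the decorated function body is hypothetical:

    from oslo_concurrency import lockutils

    # Entering the function logs "Acquiring"/"acquired"; returning logs
    # "released" with the hold time, i.e. the DEBUG triple seen above.
    @lockutils.synchronized('compute_resources')
    def update_available_resource():
        pass  # hypothetical critical section, runs with the lock held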
Dec 06 10:21:53 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:21:53 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:21:53 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:53.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:21:53 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:21:53 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:21:53 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:53.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:21:54 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:21:54 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/1265861855' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:21:54 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:21:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:21:54.297 141446 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:21:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:21:54.297 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:21:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:21:54.297 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:21:55 compute-1 ceph-mon[79770]: pgmap v1225: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:21:55 compute-1 nova_compute[228576]: 2025-12-06 10:21:55.664 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:21:55 compute-1 nova_compute[228576]: 2025-12-06 10:21:55.664 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:21:55 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:21:55 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:21:55 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:55.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:21:55 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:21:55 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:21:55 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:55.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:21:56 compute-1 ceph-mon[79770]: pgmap v1226: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:21:56 compute-1 nova_compute[228576]: 2025-12-06 10:21:56.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:21:56 compute-1 sudo[246946]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:21:56 compute-1 sudo[246946]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:21:56 compute-1 sudo[246946]: pam_unix(sudo:session): session closed for user root
Dec 06 10:21:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:56 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:21:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:56 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:21:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:56 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:21:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:21:57 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:21:57 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:21:57 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:21:57 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:57.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:21:57 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:21:57 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:21:57 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:57.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:21:59 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:21:59 compute-1 nova_compute[228576]: 2025-12-06 10:21:59.463 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:21:59 compute-1 nova_compute[228576]: 2025-12-06 10:21:59.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:21:59 compute-1 nova_compute[228576]: 2025-12-06 10:21:59.470 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:21:59 compute-1 nova_compute[228576]: 2025-12-06 10:21:59.470 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:21:59 compute-1 ceph-mon[79770]: pgmap v1227: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:21:59 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:21:59 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:21:59 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:21:59.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:21:59 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:21:59 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:21:59 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:21:59.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:21:59 compute-1 nova_compute[228576]: 2025-12-06 10:21:59.966 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 06 10:22:00 compute-1 ceph-mon[79770]: pgmap v1228: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:22:01 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:22:01 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:22:01 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:01.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:22:01 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:22:01 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:22:01 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:01.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:22:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:01 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:22:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:01 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:22:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:01 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:22:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:02 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:22:02 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/3408085492' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:22:03 compute-1 ceph-mon[79770]: pgmap v1229: 337 pgs: 337 active+clean; 41 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:22:03 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/3417508859' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:22:03 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:22:03 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:22:03 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:03.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:22:03 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:22:03 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:22:03 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:03.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:22:04 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:22:04 compute-1 ceph-mon[79770]: pgmap v1230: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 29 KiB/s rd, 0 B/s wr, 47 op/s
Dec 06 10:22:04 compute-1 podman[246975]: 2025-12-06 10:22:04.812237025 +0000 UTC m=+0.122070415 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller)
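The podman health_status=healthy events interleaved with this log come from the healthcheck declared in config_data ('test': '/openstack/healthcheck'). A hedged way to trigger the same check by hand via podman's healthcheck subcommand; the container name is the one in the event above:

    import subprocess

    # Exit code 0 means the container's configured healthcheck passed.
    rc = subprocess.run(
        ['podman', 'healthcheck', 'run', 'ovn_controller']).returncode
    print('healthy' if rc == 0 else 'unhealthy (rc=%d)' % rc)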
Dec 06 10:22:05 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:22:05 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:22:05 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:22:05 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:05.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:22:05 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:22:05 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:05.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:22:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:06 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:22:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:06 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:22:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:06 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:22:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:07 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:22:07 compute-1 ceph-mon[79770]: pgmap v1231: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 28 KiB/s rd, 0 B/s wr, 46 op/s
Dec 06 10:22:07 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:22:07 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:22:07 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:07.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:22:07 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:22:07 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:22:07 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:07.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:22:08 compute-1 ceph-mon[79770]: pgmap v1232: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 83 KiB/s rd, 0 B/s wr, 138 op/s
Dec 06 10:22:08 compute-1 podman[247003]: 2025-12-06 10:22:08.757196846 +0000 UTC m=+0.059276584 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec 06 10:22:09 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:22:09 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:22:09 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:22:09 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:22:09 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:22:09 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:09.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:22:09 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:22:09 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:09.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:22:10 compute-1 ceph-mon[79770]: pgmap v1233: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 83 KiB/s rd, 0 B/s wr, 137 op/s
Dec 06 10:22:11 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:22:11 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:22:11 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:11.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:22:11 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:22:11 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:22:11 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:11.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:22:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:11 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:22:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:11 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:22:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:11 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:22:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:12 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:22:12 compute-1 podman[247026]: 2025-12-06 10:22:12.765971663 +0000 UTC m=+0.062910695 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:22:13 compute-1 ceph-mon[79770]: pgmap v1234: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 83 KiB/s rd, 0 B/s wr, 137 op/s
Dec 06 10:22:13 compute-1 nova_compute[228576]: 2025-12-06 10:22:13.537 228580 DEBUG oslo_concurrency.processutils [None req-1b326720-1719-4a67-9e7f-ab0eb7cb97ad bcb29c3303b24519a22c267aaed79458 3e0ab101ca7547d4a515169a0f2edef3 - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:22:13 compute-1 nova_compute[228576]: 2025-12-06 10:22:13.570 228580 DEBUG oslo_concurrency.processutils [None req-1b326720-1719-4a67-9e7f-ab0eb7cb97ad bcb29c3303b24519a22c267aaed79458 3e0ab101ca7547d4a515169a0f2edef3 - - default default] CMD "env LANG=C uptime" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:22:13 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:22:13 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:22:13 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:13.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:22:13 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:22:13 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:22:13 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:13.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:22:14 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:22:14 compute-1 ceph-mon[79770]: pgmap v1235: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 83 KiB/s rd, 0 B/s wr, 138 op/s
Dec 06 10:22:15 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:22:15 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:22:15 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:15.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:22:15 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:22:15 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:22:15 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:15.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:22:16 compute-1 sudo[247049]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:22:16 compute-1 sudo[247049]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:22:16 compute-1 sudo[247049]: pam_unix(sudo:session): session closed for user root
Dec 06 10:22:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:16 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:22:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:16 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:22:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:16 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:22:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:17 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:22:17 compute-1 ceph-mon[79770]: pgmap v1236: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 55 KiB/s rd, 0 B/s wr, 91 op/s
Dec 06 10:22:17 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:22:17 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:22:17 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:17.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:22:17 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:22:17 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:22:17 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:17.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:22:18 compute-1 ceph-mon[79770]: pgmap v1237: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 55 KiB/s rd, 0 B/s wr, 91 op/s
Dec 06 10:22:19 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:22:19 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:22:19.801 141446 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '9a:dc:0d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'b6:0a:c4:b8:be:39'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 06 10:22:19 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:22:19.803 141446 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 06 10:22:19 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:22:19 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:22:19 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:22:19 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:19.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:22:19 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:22:19 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:19.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:22:21 compute-1 ceph-mon[79770]: pgmap v1238: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:22:21 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:22:21 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:22:21 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:22:21 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:22:21 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:21.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:22:21 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:21.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:22:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:21 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:22:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:21 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:22:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:21 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:22:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:22 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:22:23 compute-1 ceph-mon[79770]: pgmap v1239: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:22:23 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:22:23 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:22:23 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:22:23 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:22:23 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:23.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:22:23 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:23.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:22:24 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:22:24 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:22:25 compute-1 ceph-mon[79770]: pgmap v1240: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:22:25 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:22:25.806 141446 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=61eba479-a995-4b31-88b9-8ebfcea9907e, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 06 10:22:25 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:22:25 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:22:25 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:25.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:22:25 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:22:25 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:22:25 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:25.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:22:26 compute-1 ceph-mon[79770]: pgmap v1241: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:22:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:27 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:22:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:27 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:22:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:27 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:22:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:27 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:22:27 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:22:27 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:22:27 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:22:27 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:22:27 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:27.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:22:27 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:27.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:22:29 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:22:29 compute-1 ceph-mon[79770]: pgmap v1242: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:22:29 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:22:29 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:22:29 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:22:29 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:22:29 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:29.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:22:29 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:29.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:22:31 compute-1 ceph-mon[79770]: pgmap v1243: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:22:31 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:22:31 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:22:31 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:31.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:22:31 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:22:31 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:22:31 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:31.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:22:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:31 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:22:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:32 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:22:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:32 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:22:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:32 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:22:33 compute-1 ceph-mon[79770]: pgmap v1244: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:22:33 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:22:33 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:22:33 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:33.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:22:33 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:22:33 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:22:33 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:33.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:22:34 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:22:34 compute-1 ceph-mon[79770]: pgmap v1245: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:22:35 compute-1 podman[247083]: 2025-12-06 10:22:35.770542018 +0000 UTC m=+0.077914837 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:22:35 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:22:35 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:22:35 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:35.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:22:35 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:22:35 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:22:35 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:35.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:22:36 compute-1 sudo[247110]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:22:36 compute-1 sudo[247110]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:22:36 compute-1 sudo[247110]: pam_unix(sudo:session): session closed for user root
Dec 06 10:22:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:36 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:22:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:36 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:22:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:36 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:22:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:37 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:22:37 compute-1 ceph-mon[79770]: pgmap v1246: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:22:37 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:22:37 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:22:37 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:22:37 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:22:37 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:37.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:22:37 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:37.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:22:39 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:22:39 compute-1 ceph-mon[79770]: pgmap v1247: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:22:39 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:22:39 compute-1 podman[247136]: 2025-12-06 10:22:39.778879635 +0000 UTC m=+0.083322062 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent)
Dec 06 10:22:39 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:22:39 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:22:39 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:22:39 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:39.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:22:39 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:22:39 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:39.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:22:41 compute-1 ceph-mon[79770]: pgmap v1248: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:22:41 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:22:41 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:22:41 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:22:41 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:22:41 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:41.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:22:41 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:41.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:22:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:41 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:22:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:41 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:22:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:41 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:22:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:42 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:22:42 compute-1 ceph-mon[79770]: pgmap v1249: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:22:42 compute-1 sudo[247159]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:22:42 compute-1 sudo[247159]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:22:42 compute-1 sudo[247159]: pam_unix(sudo:session): session closed for user root
Dec 06 10:22:42 compute-1 sudo[247184]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 06 10:22:42 compute-1 sudo[247184]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:22:42 compute-1 podman[247208]: 2025-12-06 10:22:42.869284728 +0000 UTC m=+0.064768891 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Dec 06 10:22:43 compute-1 sudo[247184]: pam_unix(sudo:session): session closed for user root
Dec 06 10:22:43 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 10:22:43 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 06 10:22:43 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:22:43 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:22:43 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 06 10:22:43 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 06 10:22:43 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 10:22:43 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:22:43 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:22:43 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:43.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:22:43 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:22:43 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:22:43 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:43.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:22:44 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:22:44 compute-1 ceph-mon[79770]: pgmap v1250: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:22:45 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:22:45 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:22:45 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:22:45 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:45.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:22:45 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:22:45 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:45.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:22:46 compute-1 ceph-mon[79770]: from='client.? 192.168.122.10:0/142882542' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 06 10:22:46 compute-1 ceph-mon[79770]: from='client.? 192.168.122.10:0/142882542' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 06 10:22:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:46 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:22:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:46 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:22:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:46 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:22:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:47 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:22:47 compute-1 ceph-mon[79770]: pgmap v1251: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:22:47 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:22:47 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:22:47 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:22:47 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:47.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:22:47 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:22:47 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:47.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:22:48 compute-1 sudo[247264]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:22:48 compute-1 sudo[247264]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:22:48 compute-1 sudo[247264]: pam_unix(sudo:session): session closed for user root
Dec 06 10:22:48 compute-1 ceph-mon[79770]: pgmap v1252: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:22:48 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:22:48 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:22:49 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:22:49 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:22:49 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:22:49 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:49.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:22:49 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:22:49 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:22:49 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:49.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:22:51 compute-1 ceph-mon[79770]: pgmap v1253: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:22:51 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:22:51 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:22:51 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:51.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:22:51 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:22:51 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:22:51 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:51.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:22:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:51 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:22:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:51 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:22:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:51 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:22:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:52 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:22:52 compute-1 nova_compute[228576]: 2025-12-06 10:22:52.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:22:53 compute-1 ceph-mon[79770]: pgmap v1254: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:22:53 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/2784602875' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:22:53 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/605113535' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:22:53 compute-1 nova_compute[228576]: 2025-12-06 10:22:53.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:22:53 compute-1 nova_compute[228576]: 2025-12-06 10:22:53.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:22:53 compute-1 nova_compute[228576]: 2025-12-06 10:22:53.471 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:22:53 compute-1 nova_compute[228576]: 2025-12-06 10:22:53.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:22:53 compute-1 nova_compute[228576]: 2025-12-06 10:22:53.496 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:22:53 compute-1 nova_compute[228576]: 2025-12-06 10:22:53.496 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:22:53 compute-1 nova_compute[228576]: 2025-12-06 10:22:53.497 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:22:53 compute-1 nova_compute[228576]: 2025-12-06 10:22:53.497 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:22:53 compute-1 nova_compute[228576]: 2025-12-06 10:22:53.497 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:22:53 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:22:53 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:22:53 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:53.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:22:53 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:22:53 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:22:53 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:53.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:22:53 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:22:53 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1314528910' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:22:53 compute-1 nova_compute[228576]: 2025-12-06 10:22:53.974 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.477s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:22:54 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:22:54 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/1314528910' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:22:54 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:22:54 compute-1 nova_compute[228576]: 2025-12-06 10:22:54.186 228580 WARNING nova.virt.libvirt.driver [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:22:54 compute-1 nova_compute[228576]: 2025-12-06 10:22:54.188 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5169MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:22:54 compute-1 nova_compute[228576]: 2025-12-06 10:22:54.188 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:22:54 compute-1 nova_compute[228576]: 2025-12-06 10:22:54.188 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:22:54 compute-1 nova_compute[228576]: 2025-12-06 10:22:54.272 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:22:54 compute-1 nova_compute[228576]: 2025-12-06 10:22:54.272 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:22:54 compute-1 nova_compute[228576]: 2025-12-06 10:22:54.289 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:22:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:22:54.298 141446 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:22:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:22:54.300 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:22:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:22:54.300 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:22:54 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:22:54 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3977870626' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:22:54 compute-1 nova_compute[228576]: 2025-12-06 10:22:54.757 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.467s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:22:54 compute-1 nova_compute[228576]: 2025-12-06 10:22:54.763 228580 DEBUG nova.compute.provider_tree [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed in ProviderTree for provider: ff2f17cb-ff1d-4da7-9560-4be741380cb1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:22:54 compute-1 nova_compute[228576]: 2025-12-06 10:22:54.786 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed for provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:22:54 compute-1 nova_compute[228576]: 2025-12-06 10:22:54.789 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:22:54 compute-1 nova_compute[228576]: 2025-12-06 10:22:54.789 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.601s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:22:55 compute-1 ceph-mon[79770]: pgmap v1255: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:22:55 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/3977870626' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:22:55 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:22:55 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:22:55 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:55.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:22:55 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:22:55 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:22:55 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:55.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:22:56 compute-1 ceph-mon[79770]: pgmap v1256: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:22:56 compute-1 nova_compute[228576]: 2025-12-06 10:22:56.789 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:22:56 compute-1 nova_compute[228576]: 2025-12-06 10:22:56.789 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:22:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:56 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:22:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:56 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:22:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:56 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:22:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:22:57 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:22:57 compute-1 sudo[247337]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:22:57 compute-1 sudo[247337]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:22:57 compute-1 sudo[247337]: pam_unix(sudo:session): session closed for user root
Dec 06 10:22:57 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:22:57 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:22:57 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:57.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:22:57 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:22:57 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:22:57 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:57.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:22:58 compute-1 nova_compute[228576]: 2025-12-06 10:22:58.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:22:59 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:22:59 compute-1 ceph-mon[79770]: pgmap v1257: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:22:59 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:22:59 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:22:59 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:22:59.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:22:59 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:22:59 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:22:59 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:22:59.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:23:00 compute-1 nova_compute[228576]: 2025-12-06 10:23:00.464 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:23:01 compute-1 ceph-mon[79770]: pgmap v1258: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:23:01 compute-1 nova_compute[228576]: 2025-12-06 10:23:01.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:23:01 compute-1 nova_compute[228576]: 2025-12-06 10:23:01.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:23:01 compute-1 nova_compute[228576]: 2025-12-06 10:23:01.472 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:23:01 compute-1 nova_compute[228576]: 2025-12-06 10:23:01.472 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:23:01 compute-1 nova_compute[228576]: 2025-12-06 10:23:01.507 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 06 10:23:01 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:23:01 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:23:01 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:01.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:23:01 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:23:01 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:23:01 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:01.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:23:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:01 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:23:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:01 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:23:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:01 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:23:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:02 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:23:02 compute-1 ceph-mon[79770]: pgmap v1259: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:23:03 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/2126492916' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:23:03 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:23:03 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:23:03 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:03.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:23:03 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:23:03 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:23:03 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:03.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:23:04 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:23:04 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/1293102009' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:23:04 compute-1 ceph-mon[79770]: pgmap v1260: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:23:05 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:23:05 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:23:05 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:05.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:23:05 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:23:05 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:23:05 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:05.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:23:06 compute-1 podman[247367]: 2025-12-06 10:23:06.829349952 +0000 UTC m=+0.129448239 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 06 10:23:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:06 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:23:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:06 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:23:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:06 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:23:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:07 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:23:07 compute-1 ceph-mon[79770]: pgmap v1261: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:23:07 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:23:07 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:23:07 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:23:07 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:07.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:23:07 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:23:07 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:07.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:23:09 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:23:09 compute-1 ceph-mon[79770]: pgmap v1262: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:23:09 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:23:09 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:23:09 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:23:09 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:23:09 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:09.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:23:09 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:23:09 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:09.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:23:10 compute-1 ceph-mon[79770]: pgmap v1263: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:23:10 compute-1 podman[247396]: 2025-12-06 10:23:10.772033337 +0000 UTC m=+0.077070566 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_metadata_agent)
Dec 06 10:23:11 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:23:11 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:23:11 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:11.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:23:11 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:23:11 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:23:11 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:11.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:23:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:12 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:23:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:12 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:23:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:12 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:23:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:12 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:23:13 compute-1 ceph-mon[79770]: pgmap v1264: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:23:13 compute-1 podman[247416]: 2025-12-06 10:23:13.766168427 +0000 UTC m=+0.067659963 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team)
Dec 06 10:23:13 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:23:13 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:23:13 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:23:13 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:13.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:23:13 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:23:13 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:13.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:23:14 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:23:15 compute-1 ceph-mon[79770]: pgmap v1265: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:23:15 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:23:15 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:23:15 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:15.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:23:15 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:23:15 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:23:15 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:15.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:23:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:16 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:23:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:16 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:23:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:16 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:23:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:17 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:23:17 compute-1 sudo[247438]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:23:17 compute-1 sudo[247438]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:23:17 compute-1 sudo[247438]: pam_unix(sudo:session): session closed for user root
Dec 06 10:23:17 compute-1 ceph-mon[79770]: pgmap v1266: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:23:17 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:23:17 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:23:17 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:23:17 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:23:17 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:17.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:23:17 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:17.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:23:18 compute-1 ceph-mon[79770]: pgmap v1267: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:23:19 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:23:19 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:23:19 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:23:19 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:23:19 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:19.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:23:19 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:23:19 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:19.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:23:21 compute-1 ceph-mon[79770]: pgmap v1268: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:23:21 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:23:21 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:23:21 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:23:21 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:23:21 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:21.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:23:21 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:21.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:23:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:21 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:23:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:21 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:23:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:21 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:23:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:22 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:23:22 compute-1 ceph-mon[79770]: pgmap v1269: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:23:23 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:23:23 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:23:23 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:23.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:23:23 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:23:23 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:23:23 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:23.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:23:24 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:23:24 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:23:25 compute-1 ceph-mon[79770]: pgmap v1270: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:23:25 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:23:25 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:23:25 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:25.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:23:25 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:23:25 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:23:25 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:25.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:23:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:26 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:23:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:26 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:23:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:26 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:23:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:26 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:23:27 compute-1 ceph-mon[79770]: pgmap v1271: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:23:27 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:23:27 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:23:27 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:27.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:23:27 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:23:27 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:23:27 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:27.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:23:29 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:23:29 compute-1 ceph-mon[79770]: pgmap v1272: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:23:29 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:23:29 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:23:29 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:29.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:23:29 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:23:29 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:23:29 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:29.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:23:30 compute-1 ceph-mon[79770]: pgmap v1273: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:23:31 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:23:31 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:23:31 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:31.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:23:31 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:23:31 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:23:31 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:31.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:23:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:31 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:23:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:31 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:23:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:31 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:23:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:32 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:23:33 compute-1 ceph-mon[79770]: pgmap v1274: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:23:33 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:23:33 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:23:33 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:33.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:23:33 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:23:33 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:23:33 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:33.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:23:34 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:23:35 compute-1 ceph-mon[79770]: pgmap v1275: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:23:35 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:23:35 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:23:35 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:23:35 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.002000050s ======
Dec 06 10:23:35 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:35.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:23:35 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:35.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000050s
Dec 06 10:23:36 compute-1 ceph-mon[79770]: pgmap v1276: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:23:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:36 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:23:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:36 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:23:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:36 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:23:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:37 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:23:37 compute-1 sudo[247473]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:23:37 compute-1 sudo[247473]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:23:37 compute-1 sudo[247473]: pam_unix(sudo:session): session closed for user root
Dec 06 10:23:37 compute-1 podman[247497]: 2025-12-06 10:23:37.385678977 +0000 UTC m=+0.101189056 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 06 10:23:37 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:23:37 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:23:37 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:23:37 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:37.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:23:37 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:23:37 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:37.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:23:39 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:23:39 compute-1 ceph-mon[79770]: pgmap v1277: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:23:39 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:23:39 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:23:39 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:23:39 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:23:39 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:39.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:23:39 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:23:39 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:39.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:23:41 compute-1 ceph-mon[79770]: pgmap v1278: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:23:41 compute-1 podman[247526]: 2025-12-06 10:23:41.74793509 +0000 UTC m=+0.056478854 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 06 10:23:41 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:23:41 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:23:41 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:23:41 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:41.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:23:41 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:23:41 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:41.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:23:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:41 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:23:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:41 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:23:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:41 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:23:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:41 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:23:42 compute-1 ceph-mon[79770]: pgmap v1279: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:23:43 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:23:43 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:23:43 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:23:43 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:43.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:23:43 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:23:43 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:43.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:23:44 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:23:44 compute-1 podman[247549]: 2025-12-06 10:23:44.795727042 +0000 UTC m=+0.094425128 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3)
Dec 06 10:23:45 compute-1 ceph-mon[79770]: pgmap v1280: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:23:45 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:23:45 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:23:45 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:23:45 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:23:45 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:45.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:23:45 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:45.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:23:46 compute-1 ceph-mon[79770]: from='client.? 192.168.122.10:0/1770852714' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 06 10:23:46 compute-1 ceph-mon[79770]: from='client.? 192.168.122.10:0/1770852714' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 06 10:23:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:46 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:23:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:46 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:23:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:46 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:23:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:47 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:23:47 compute-1 ceph-mon[79770]: pgmap v1281: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:23:47 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:23:47 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:23:47 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:23:47 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:47.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:23:47 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:23:47 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:47.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:23:48 compute-1 sudo[247572]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:23:48 compute-1 sudo[247572]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:23:48 compute-1 sudo[247572]: pam_unix(sudo:session): session closed for user root
Dec 06 10:23:48 compute-1 sudo[247597]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 06 10:23:48 compute-1 sudo[247597]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:23:48 compute-1 ceph-mon[79770]: pgmap v1282: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:23:49 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:23:49 compute-1 sudo[247597]: pam_unix(sudo:session): session closed for user root
Dec 06 10:23:49 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 10:23:49 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 06 10:23:49 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:23:49 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:23:49 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 06 10:23:49 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 06 10:23:49 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 10:23:49 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:23:49 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:23:49 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:23:49 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:49.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:23:49 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:23:49 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:49.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:23:50 compute-1 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #70. Immutable memtables: 0.
Dec 06 10:23:50 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:23:50.082204) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 06 10:23:50 compute-1 ceph-mon[79770]: rocksdb: [db/flush_job.cc:856] [default] [JOB 41] Flushing memtable with next log file: 70
Dec 06 10:23:50 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016630082434, "job": 41, "event": "flush_started", "num_memtables": 1, "num_entries": 1706, "num_deletes": 250, "total_data_size": 4188119, "memory_usage": 4252240, "flush_reason": "Manual Compaction"}
Dec 06 10:23:50 compute-1 ceph-mon[79770]: rocksdb: [db/flush_job.cc:885] [default] [JOB 41] Level-0 flush table #71: started
Dec 06 10:23:50 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016630106061, "cf_name": "default", "job": 41, "event": "table_file_creation", "file_number": 71, "file_size": 2747624, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 36889, "largest_seqno": 38590, "table_properties": {"data_size": 2740571, "index_size": 4060, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1925, "raw_key_size": 13926, "raw_average_key_size": 18, "raw_value_size": 2726508, "raw_average_value_size": 3669, "num_data_blocks": 178, "num_entries": 743, "num_filter_entries": 743, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765016481, "oldest_key_time": 1765016481, "file_creation_time": 1765016630, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 71, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:23:50 compute-1 ceph-mon[79770]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 41] Flush lasted 23956 microseconds, and 13178 cpu microseconds.
Dec 06 10:23:50 compute-1 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:23:50 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:23:50.106182) [db/flush_job.cc:967] [default] [JOB 41] Level-0 flush table #71: 2747624 bytes OK
Dec 06 10:23:50 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:23:50.106210) [db/memtable_list.cc:519] [default] Level-0 commit table #71 started
Dec 06 10:23:50 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:23:50.107352) [db/memtable_list.cc:722] [default] Level-0 commit table #71: memtable #1 done
Dec 06 10:23:50 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:23:50.107370) EVENT_LOG_v1 {"time_micros": 1765016630107365, "job": 41, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 06 10:23:50 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:23:50.107392) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 06 10:23:50 compute-1 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 41] Try to delete WAL files size 4180307, prev total WAL file size 4180307, number of live WAL files 2.
Dec 06 10:23:50 compute-1 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000067.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:23:50 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:23:50.109003) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B7600323533' seq:72057594037927935, type:22 .. '6B7600353034' seq:0, type:0; will stop at (end)
Dec 06 10:23:50 compute-1 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 42] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 06 10:23:50 compute-1 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 41 Base level 0, inputs: [71(2683KB)], [69(12MB)]
Dec 06 10:23:50 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016630109176, "job": 42, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [71], "files_L6": [69], "score": -1, "input_data_size": 15892868, "oldest_snapshot_seqno": -1}
Dec 06 10:23:50 compute-1 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 42] Generated table #72: 6853 keys, 14489425 bytes, temperature: kUnknown
Dec 06 10:23:50 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016630191424, "cf_name": "default", "job": 42, "event": "table_file_creation", "file_number": 72, "file_size": 14489425, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14445125, "index_size": 26076, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 17157, "raw_key_size": 178820, "raw_average_key_size": 26, "raw_value_size": 14323181, "raw_average_value_size": 2090, "num_data_blocks": 1032, "num_entries": 6853, "num_filter_entries": 6853, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014012, "oldest_key_time": 0, "file_creation_time": 1765016630, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 72, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:23:50 compute-1 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:23:50 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:23:50.191670) [db/compaction/compaction_job.cc:1663] [default] [JOB 42] Compacted 1@0 + 1@6 files to L6 => 14489425 bytes
Dec 06 10:23:50 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:23:50.192819) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 193.1 rd, 176.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.6, 12.5 +0.0 blob) out(13.8 +0.0 blob), read-write-amplify(11.1) write-amplify(5.3) OK, records in: 7367, records dropped: 514 output_compression: NoCompression
Dec 06 10:23:50 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:23:50.192838) EVENT_LOG_v1 {"time_micros": 1765016630192829, "job": 42, "event": "compaction_finished", "compaction_time_micros": 82324, "compaction_time_cpu_micros": 44731, "output_level": 6, "num_output_files": 1, "total_output_size": 14489425, "num_input_records": 7367, "num_output_records": 6853, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 06 10:23:50 compute-1 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000071.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:23:50 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016630193526, "job": 42, "event": "table_file_deletion", "file_number": 71}
Dec 06 10:23:50 compute-1 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000069.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:23:50 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016630195988, "job": 42, "event": "table_file_deletion", "file_number": 69}
Dec 06 10:23:50 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:23:50.108876) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:23:50 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:23:50.196110) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:23:50 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:23:50.196117) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:23:50 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:23:50.196118) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:23:50 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:23:50.196119) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:23:50 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:23:50.196121) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:23:51 compute-1 ceph-mon[79770]: pgmap v1283: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:23:51 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:23:51 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:23:51 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:23:51 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:51.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:23:51 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:23:51 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:51.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:23:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:51 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:23:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:52 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:23:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:52 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:23:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:52 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:23:53 compute-1 ceph-mon[79770]: pgmap v1284: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:23:53 compute-1 nova_compute[228576]: 2025-12-06 10:23:53.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:23:53 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:23:53 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:23:53 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:53.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:23:53 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:23:53 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:23:53 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:53.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:23:54 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:23:54 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/1252620075' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:23:54 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/3964484987' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:23:54 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:23:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:23:54.299 141446 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:23:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:23:54.299 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:23:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:23:54.299 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:23:54 compute-1 sudo[247656]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:23:54 compute-1 sudo[247656]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:23:54 compute-1 sudo[247656]: pam_unix(sudo:session): session closed for user root
Dec 06 10:23:54 compute-1 nova_compute[228576]: 2025-12-06 10:23:54.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:23:54 compute-1 nova_compute[228576]: 2025-12-06 10:23:54.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:23:54 compute-1 nova_compute[228576]: 2025-12-06 10:23:54.471 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:23:54 compute-1 nova_compute[228576]: 2025-12-06 10:23:54.472 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:23:54 compute-1 nova_compute[228576]: 2025-12-06 10:23:54.506 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:23:54 compute-1 nova_compute[228576]: 2025-12-06 10:23:54.507 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:23:54 compute-1 nova_compute[228576]: 2025-12-06 10:23:54.507 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:23:54 compute-1 nova_compute[228576]: 2025-12-06 10:23:54.507 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:23:54 compute-1 nova_compute[228576]: 2025-12-06 10:23:54.507 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:23:54 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:23:54 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3457887194' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:23:55 compute-1 nova_compute[228576]: 2025-12-06 10:23:54.999 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.492s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:23:55 compute-1 nova_compute[228576]: 2025-12-06 10:23:55.157 228580 WARNING nova.virt.libvirt.driver [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:23:55 compute-1 nova_compute[228576]: 2025-12-06 10:23:55.158 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5150MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:23:55 compute-1 nova_compute[228576]: 2025-12-06 10:23:55.159 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:23:55 compute-1 nova_compute[228576]: 2025-12-06 10:23:55.159 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:23:55 compute-1 ceph-mon[79770]: pgmap v1285: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:23:55 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:23:55 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:23:55 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/3457887194' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:23:55 compute-1 nova_compute[228576]: 2025-12-06 10:23:55.210 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:23:55 compute-1 nova_compute[228576]: 2025-12-06 10:23:55.211 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:23:55 compute-1 nova_compute[228576]: 2025-12-06 10:23:55.224 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:23:55 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:23:55 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1175837351' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:23:55 compute-1 nova_compute[228576]: 2025-12-06 10:23:55.667 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.443s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:23:55 compute-1 nova_compute[228576]: 2025-12-06 10:23:55.674 228580 DEBUG nova.compute.provider_tree [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed in ProviderTree for provider: ff2f17cb-ff1d-4da7-9560-4be741380cb1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:23:55 compute-1 nova_compute[228576]: 2025-12-06 10:23:55.690 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed for provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:23:55 compute-1 nova_compute[228576]: 2025-12-06 10:23:55.691 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:23:55 compute-1 nova_compute[228576]: 2025-12-06 10:23:55.691 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.532s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:23:55 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:23:55 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:23:55 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:23:55 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:55.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:23:55 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:23:55 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:55.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:23:56 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/1175837351' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:23:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:56 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:23:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:56 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:23:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:56 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:23:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:23:57 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:23:57 compute-1 ceph-mon[79770]: pgmap v1286: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:23:57 compute-1 sudo[247726]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:23:57 compute-1 sudo[247726]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:23:57 compute-1 sudo[247726]: pam_unix(sudo:session): session closed for user root
Dec 06 10:23:58 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:23:58 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:23:58 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:23:58 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:23:57.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:23:58 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:23:58 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:23:57.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:23:58 compute-1 ceph-mon[79770]: pgmap v1287: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:23:58 compute-1 nova_compute[228576]: 2025-12-06 10:23:58.691 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:23:58 compute-1 nova_compute[228576]: 2025-12-06 10:23:58.692 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:23:59 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:23:59 compute-1 nova_compute[228576]: 2025-12-06 10:23:59.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:24:00 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:24:00 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:24:00 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:00.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:24:00 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:24:00 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:24:00 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:00.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
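The radosgw beast lines above repeat every two seconds for anonymous `HEAD /` probes from 192.168.122.100 and 192.168.122.102, which look like load-balancer health checks against the RGW endpoint (the log itself does not identify the prober). For ad-hoc analysis, the access-log fields can be parsed with a small sketch like the following; the layout is inferred from the lines themselves, not from any documented RGW format string:

```python
import re

# Field layout read off the beast lines above:
# pointer, client IP, "-", user, [timestamp], "request", status, bytes,
# three "-" placeholders, latency=<seconds>s
BEAST_RE = re.compile(
    r'beast: (?P<ptr>0x[0-9a-f]+): (?P<ip>\S+) - (?P<user>\S+) '
    r'\[(?P<ts>[^\]]+)\] "(?P<req>[^"]+)" (?P<status>\d+) (?P<bytes>\d+) '
    r'.*latency=(?P<lat>[\d.]+)s'
)

line = ('beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous '
        '[06/Dec/2025:10:24:00.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - '
        'latency=0.000000000s')
m = BEAST_RE.search(line)
if m:
    print(m['ip'], m['req'], m['status'], m['lat'])
# -> 192.168.122.100 HEAD / HTTP/1.0 200 0.000000000
```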
Dec 06 10:24:01 compute-1 ceph-mon[79770]: pgmap v1288: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:24:01 compute-1 nova_compute[228576]: 2025-12-06 10:24:01.463 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:24:01 compute-1 nova_compute[228576]: 2025-12-06 10:24:01.469 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:24:01 compute-1 nova_compute[228576]: 2025-12-06 10:24:01.470 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:24:01 compute-1 nova_compute[228576]: 2025-12-06 10:24:01.470 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:24:01 compute-1 nova_compute[228576]: 2025-12-06 10:24:01.487 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
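These DEBUG entries are nova-compute's oslo.service periodic tasks firing on their timers; `_heal_instance_info_cache` runs, rebuilds its candidate list, and exits immediately because no instances are hosted here. The registration pattern behind them looks roughly like the sketch below (a minimal illustration of the oslo.service pattern, not nova's actual manager code):

```python
from oslo_config import cfg
from oslo_service import periodic_task

CONF = cfg.CONF

class Manager(periodic_task.PeriodicTasks):
    """Tasks register themselves via the periodic_task decorator."""

    @periodic_task.periodic_task(spacing=60)
    def _heal_instance_info_cache(self, context):
        # With no instances on the host this is a no-op, matching the
        # "Didn't find any instances for network info cache update" line.
        pass

mgr = Manager(CONF)
mgr.run_periodic_tasks(context=None)  # normally driven by a looping call
```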
Dec 06 10:24:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:01 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:24:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:01 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:24:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:01 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:24:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:02 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
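The same four-line ganesha.nfsd cycle recurs roughly every five seconds: the server re-enters a 90-second grace period, reloads (empty) client reclaim state from the RADOS recovery backend, sees `clid count(0)`, and `rados_cluster_grace_enforcing` returns -45, so the grace period is never lifted; presumably the cluster-wide grace database has not settled into the enforcing state, though the log alone does not say why. A quick way to confirm the cadence from a saved journal dump (the file name `journal.log` is illustrative):

```python
from datetime import datetime

# Collect the timestamps of each "NFS Server Now IN GRACE" event and print
# the gap between consecutive re-entries.
stamps = []
with open('journal.log') as fh:
    for line in fh:
        if 'nfs_start_grace' in line and 'Now IN GRACE' in line:
            # journald prefix: "Dec 06 10:23:57 compute-1 ..." (15 chars)
            stamps.append(datetime.strptime(line[:15], '%b %d %H:%M:%S'))

for a, b in zip(stamps, stamps[1:]):
    print(f'grace re-entered after {(b - a).total_seconds():.0f}s')
```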
Dec 06 10:24:02 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:24:02 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:24:02 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:24:02 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:02.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:24:02 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:24:02 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:02.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:24:03 compute-1 ceph-mon[79770]: pgmap v1289: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:24:04 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:24:04 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:24:04 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:24:04 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:04.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:24:04 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:24:04 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:04.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:24:04 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:24:04 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/747380932' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:24:05 compute-1 ceph-mon[79770]: pgmap v1290: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:24:05 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/962126688' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
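The recurring `client.openstack` dispatches are `{"prefix": "df"}` capacity polls arriving from several hosts; given the entity name, these are presumably the OpenStack storage services checking pool usage. Each line embeds its command as a JSON array, which can be lifted straight out for filtering (a sketch; the sample line is copied from above):

```python
import json
import re

line = ("from='client.? 192.168.122.102:0/962126688' entity='client.openstack' "
        'cmd=[{"prefix": "df", "format": "json"}]: dispatch')

# The cmd=[...] payload is valid JSON; extract and decode it.
cmd = json.loads(re.search(r'cmd=(\[.*\])', line).group(1))
print(cmd[0]['prefix'])  # -> df
```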
Dec 06 10:24:06 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:24:06 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:24:06 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:24:06 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:24:06 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:06.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:24:06 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:06.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:24:06 compute-1 ceph-mon[79770]: pgmap v1291: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:24:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:06 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:24:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:06 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:24:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:06 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:24:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:07 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:24:07 compute-1 podman[247756]: 2025-12-06 10:24:07.812178153 +0000 UTC m=+0.114363973 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
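Each podman health_status event packs the interesting fields (container name, health_status, health_failing_streak) into one very long key=value dump that also inlines the container's full config_data. A small extractor for just the health fields (a sketch; it relies only on the key=value shapes visible above):

```python
import re

FIELDS = ('name', 'health_status', 'health_failing_streak')

def podman_health(line: str) -> dict:
    """Pull selected key=value pairs out of a podman event line."""
    out = {}
    for key in FIELDS:
        # The lookbehind keeps 'name' from matching inside 'container_name'
        # or 'org.label-schema.name'.
        m = re.search(rf'(?<![\w.-]){key}=([^,)]+)', line)
        if m:
            out[key] = m.group(1)
    return out

# On the ovn_controller event above this yields:
# {'name': 'ovn_controller', 'health_status': 'healthy',
#  'health_failing_streak': '0'}
```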
Dec 06 10:24:08 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:24:08 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:24:08 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:08.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:24:08 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:24:08 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:24:08 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:08.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:24:09 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:24:09 compute-1 ceph-mon[79770]: pgmap v1292: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:24:09 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:24:10 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:24:10 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:24:10 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:24:10 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:24:10 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:10.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:24:10 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:10.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:24:10 compute-1 ceph-mon[79770]: pgmap v1293: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:24:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:11 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:24:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:11 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:24:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:11 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:24:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:12 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:24:12 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:24:12 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:24:12 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:24:12 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:12.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:24:12 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:24:12 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:12.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:24:12 compute-1 podman[247785]: 2025-12-06 10:24:12.761184791 +0000 UTC m=+0.063141631 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec 06 10:24:13 compute-1 ceph-mon[79770]: pgmap v1294: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:24:14 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:24:14 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:24:14 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:24:14 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:24:14 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:14.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:24:14 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:24:14 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:14.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:24:14 compute-1 ceph-mon[79770]: pgmap v1295: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:24:15 compute-1 podman[247808]: 2025-12-06 10:24:15.774797114 +0000 UTC m=+0.073478748 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec 06 10:24:16 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:24:16 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:24:16 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:24:16 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:16.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:24:16 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:24:16 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:16.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:24:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:16 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:24:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:16 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:24:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:16 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:24:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:17 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:24:17 compute-1 ceph-mon[79770]: pgmap v1296: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:24:17 compute-1 sudo[247829]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:24:17 compute-1 sudo[247829]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:24:17 compute-1 sudo[247829]: pam_unix(sudo:session): session closed for user root
Dec 06 10:24:18 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:24:18 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:24:18 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:24:18 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:18.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:24:18 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:24:18 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:18.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:24:19 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:24:19 compute-1 ceph-mon[79770]: pgmap v1297: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:24:20 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:24:20 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:24:20 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:24:20 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:20.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:24:20 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:24:20 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:20.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:24:20 compute-1 ceph-mon[79770]: pgmap v1298: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:24:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:21 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:24:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:21 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:24:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:21 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:24:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:22 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:24:22 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:24:22 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:24:22 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:24:22 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:24:22 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:22.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:24:22 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:22.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:24:23 compute-1 ceph-mon[79770]: pgmap v1299: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:24:24 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:24:24 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:24:24 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:24:24 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:24.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:24:24 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:24:24 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:24:24 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:24.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:24:24 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:24:25 compute-1 ceph-mon[79770]: pgmap v1300: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:24:26 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:24:26 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:24:26 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:26.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:24:26 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:24:26 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:24:26 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:26.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:24:26 compute-1 ceph-mon[79770]: pgmap v1301: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:24:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:26 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:24:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:26 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:24:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:26 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:24:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:27 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:24:28 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:24:28 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:24:28 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:28.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:24:28 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:24:28 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:24:28 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:28.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:24:29 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:24:29 compute-1 ceph-mon[79770]: pgmap v1302: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:24:30 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:24:30 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:24:30 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:30.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:24:30 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:24:30 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:24:30 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:30.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:24:30 compute-1 ceph-mon[79770]: pgmap v1303: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:24:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:31 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:24:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:32 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:24:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:32 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:24:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:32 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:24:32 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:24:32 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:24:32 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:32.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:24:32 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:24:32 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:24:32 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:32.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:24:33 compute-1 ceph-mon[79770]: pgmap v1304: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:24:34 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:24:34 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:24:34 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:24:34 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:34.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:24:34 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:24:34 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:24:34 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:34.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:24:34 compute-1 ceph-mon[79770]: pgmap v1305: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:24:36 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:24:36 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:24:36 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:36.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:24:36 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:24:36 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:24:36 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:36.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:24:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:36 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:24:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:36 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:24:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:36 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:24:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:37 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:24:37 compute-1 ceph-mon[79770]: pgmap v1306: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:24:37 compute-1 sudo[247864]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:24:37 compute-1 sudo[247864]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:24:37 compute-1 sudo[247864]: pam_unix(sudo:session): session closed for user root
Dec 06 10:24:38 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:24:38 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:24:38 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:38.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:24:38 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:24:38 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:24:38 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:38.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:24:38 compute-1 ceph-mon[79770]: pgmap v1307: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:24:38 compute-1 podman[247890]: 2025-12-06 10:24:38.828827827 +0000 UTC m=+0.130903854 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3)
Dec 06 10:24:39 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:24:39 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:24:40 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:24:40 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:24:40 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:40.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:24:40 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:24:40 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:24:40 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:40.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:24:40 compute-1 ceph-mon[79770]: pgmap v1308: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:24:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:41 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:24:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:41 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:24:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:41 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:24:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:42 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:24:42 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:24:42 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:24:42 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:42.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:24:42 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:24:42 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:24:42 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:42.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:24:43 compute-1 ceph-mon[79770]: pgmap v1309: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:24:43 compute-1 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #73. Immutable memtables: 0.
Dec 06 10:24:43 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:24:43.207397) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 06 10:24:43 compute-1 ceph-mon[79770]: rocksdb: [db/flush_job.cc:856] [default] [JOB 43] Flushing memtable with next log file: 73
Dec 06 10:24:43 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016683207501, "job": 43, "event": "flush_started", "num_memtables": 1, "num_entries": 759, "num_deletes": 251, "total_data_size": 1539291, "memory_usage": 1565312, "flush_reason": "Manual Compaction"}
Dec 06 10:24:43 compute-1 ceph-mon[79770]: rocksdb: [db/flush_job.cc:885] [default] [JOB 43] Level-0 flush table #74: started
Dec 06 10:24:43 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016683215270, "cf_name": "default", "job": 43, "event": "table_file_creation", "file_number": 74, "file_size": 1017152, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 38595, "largest_seqno": 39349, "table_properties": {"data_size": 1013510, "index_size": 1486, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1093, "raw_key_size": 8250, "raw_average_key_size": 19, "raw_value_size": 1006253, "raw_average_value_size": 2362, "num_data_blocks": 65, "num_entries": 426, "num_filter_entries": 426, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765016631, "oldest_key_time": 1765016631, "file_creation_time": 1765016683, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 74, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:24:43 compute-1 ceph-mon[79770]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 43] Flush lasted 7895 microseconds, and 3851 cpu microseconds.
Dec 06 10:24:43 compute-1 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:24:43 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:24:43.215303) [db/flush_job.cc:967] [default] [JOB 43] Level-0 flush table #74: 1017152 bytes OK
Dec 06 10:24:43 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:24:43.215319) [db/memtable_list.cc:519] [default] Level-0 commit table #74 started
Dec 06 10:24:43 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:24:43.216508) [db/memtable_list.cc:722] [default] Level-0 commit table #74: memtable #1 done
Dec 06 10:24:43 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:24:43.216532) EVENT_LOG_v1 {"time_micros": 1765016683216517, "job": 43, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 06 10:24:43 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:24:43.216547) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 06 10:24:43 compute-1 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 43] Try to delete WAL files size 1535275, prev total WAL file size 1535275, number of live WAL files 2.
Dec 06 10:24:43 compute-1 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000070.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:24:43 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:24:43.217099) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730033303132' seq:72057594037927935, type:22 .. '7061786F730033323634' seq:0, type:0; will stop at (end)
Dec 06 10:24:43 compute-1 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 44] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 06 10:24:43 compute-1 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 43 Base level 0, inputs: [74(993KB)], [72(13MB)]
Dec 06 10:24:43 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016683217309, "job": 44, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [74], "files_L6": [72], "score": -1, "input_data_size": 15506577, "oldest_snapshot_seqno": -1}
Dec 06 10:24:43 compute-1 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 44] Generated table #75: 6765 keys, 13325700 bytes, temperature: kUnknown
Dec 06 10:24:43 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016683310328, "cf_name": "default", "job": 44, "event": "table_file_creation", "file_number": 75, "file_size": 13325700, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13283096, "index_size": 24572, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16965, "raw_key_size": 177659, "raw_average_key_size": 26, "raw_value_size": 13163739, "raw_average_value_size": 1945, "num_data_blocks": 963, "num_entries": 6765, "num_filter_entries": 6765, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014012, "oldest_key_time": 0, "file_creation_time": 1765016683, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 75, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:24:43 compute-1 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:24:43 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:24:43.311186) [db/compaction/compaction_job.cc:1663] [default] [JOB 44] Compacted 1@0 + 1@6 files to L6 => 13325700 bytes
Dec 06 10:24:43 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:24:43.313000) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 166.3 rd, 142.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.0, 13.8 +0.0 blob) out(12.7 +0.0 blob), read-write-amplify(28.3) write-amplify(13.1) OK, records in: 7279, records dropped: 514 output_compression: NoCompression
Dec 06 10:24:43 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:24:43.313039) EVENT_LOG_v1 {"time_micros": 1765016683313021, "job": 44, "event": "compaction_finished", "compaction_time_micros": 93230, "compaction_time_cpu_micros": 56891, "output_level": 6, "num_output_files": 1, "total_output_size": 13325700, "num_input_records": 7279, "num_output_records": 6765, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 06 10:24:43 compute-1 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000074.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:24:43 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016683313631, "job": 44, "event": "table_file_deletion", "file_number": 74}
Dec 06 10:24:43 compute-1 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000072.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:24:43 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016683318386, "job": 44, "event": "table_file_deletion", "file_number": 72}
Dec 06 10:24:43 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:24:43.216988) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:24:43 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:24:43.318467) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:24:43 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:24:43.318475) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:24:43 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:24:43.318478) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:24:43 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:24:43.318481) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:24:43 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:24:43.318484) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
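The JOB 44 compaction summary above reports write-amplify(13.1) and read-write-amplify(28.3); both figures follow from the byte counts in the surrounding EVENT_LOG entries, assuming the usual RocksDB definitions (bytes written per byte ingested at L0, and reads plus writes per byte ingested):

```python
# Worked check of the JOB 44 compaction summary, using values from the log:
l0_in    = 1_017_152   # flushed L0 table #74 ("flush table #74: 1017152 bytes OK")
total_in = 15_506_577  # "input_data_size" in the compaction_started event
out      = 13_325_700  # "total_output_size" in the compaction_finished event

print(f'write-amplify      {out / l0_in:.1f}')               # 13.1
print(f'read-write-amplify {(total_in + out) / l0_in:.1f}')  # 28.3
```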
Dec 06 10:24:43 compute-1 podman[247921]: 2025-12-06 10:24:43.759421957 +0000 UTC m=+0.061919490 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true)
Dec 06 10:24:44 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:24:44 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:24:44 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:24:44 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:44.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:24:44 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:24:44 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:24:44 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:44.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:24:44 compute-1 ceph-mon[79770]: pgmap v1310: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:24:46 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:24:46 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:24:46 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:24:46 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:46.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:24:46 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:24:46 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:46.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:24:46 compute-1 ceph-mon[79770]: from='client.? 192.168.122.10:0/120415646' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 06 10:24:46 compute-1 ceph-mon[79770]: from='client.? 192.168.122.10:0/120415646' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 06 10:24:46 compute-1 podman[247944]: 2025-12-06 10:24:46.756425656 +0000 UTC m=+0.053211303 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd)
Dec 06 10:24:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:46 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:24:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:46 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:24:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:46 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:24:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:47 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:24:47 compute-1 ceph-mon[79770]: pgmap v1311: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:24:48 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:24:48 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:24:48 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:48.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:24:48 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:24:48 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:24:48 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:48.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:24:48 compute-1 ceph-mon[79770]: pgmap v1312: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:24:49 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:24:50 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:24:50 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:24:50 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:50.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:24:50 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:24:50 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:24:50 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:50.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:24:51 compute-1 ceph-mon[79770]: pgmap v1313: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:24:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:51 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:24:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:51 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:24:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:51 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:24:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:51 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:24:52 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:24:52 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:24:52 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:52.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:24:52 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:24:52 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:24:52 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:52.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:24:52 compute-1 ceph-mon[79770]: pgmap v1314: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:24:54 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:24:54 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:24:54 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:24:54 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:24:54 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:54.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:24:54 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:24:54 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:24:54 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:54.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:24:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:24:54.299 141446 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:24:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:24:54.300 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:24:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:24:54.300 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:24:54 compute-1 nova_compute[228576]: 2025-12-06 10:24:54.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:24:54 compute-1 nova_compute[228576]: 2025-12-06 10:24:54.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:24:54 compute-1 nova_compute[228576]: 2025-12-06 10:24:54.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:24:54 compute-1 nova_compute[228576]: 2025-12-06 10:24:54.471 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:24:54 compute-1 sudo[247969]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:24:54 compute-1 sudo[247969]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:24:54 compute-1 sudo[247969]: pam_unix(sudo:session): session closed for user root
Dec 06 10:24:54 compute-1 sudo[247994]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 06 10:24:54 compute-1 sudo[247994]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:24:55 compute-1 ceph-mon[79770]: pgmap v1315: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:24:55 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/3597878080' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:24:55 compute-1 sudo[247994]: pam_unix(sudo:session): session closed for user root
Dec 06 10:24:56 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/2258271433' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:24:56 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Dec 06 10:24:56 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 10:24:56 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 06 10:24:56 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:24:56 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:24:56 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 06 10:24:56 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 06 10:24:56 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 10:24:56 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:24:56 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:24:56 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:56.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:24:56 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:24:56 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:24:56 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:56.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:24:56 compute-1 nova_compute[228576]: 2025-12-06 10:24:56.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:24:56 compute-1 nova_compute[228576]: 2025-12-06 10:24:56.501 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:24:56 compute-1 nova_compute[228576]: 2025-12-06 10:24:56.501 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:24:56 compute-1 nova_compute[228576]: 2025-12-06 10:24:56.502 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:24:56 compute-1 nova_compute[228576]: 2025-12-06 10:24:56.502 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:24:56 compute-1 nova_compute[228576]: 2025-12-06 10:24:56.502 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:24:56 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:24:56 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2650253251' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:24:56 compute-1 nova_compute[228576]: 2025-12-06 10:24:56.944 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:24:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:56 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:24:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:56 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:24:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:56 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:24:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:24:57 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:24:57 compute-1 ceph-mon[79770]: pgmap v1316: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:24:57 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/2650253251' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:24:57 compute-1 nova_compute[228576]: 2025-12-06 10:24:57.115 228580 WARNING nova.virt.libvirt.driver [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:24:57 compute-1 nova_compute[228576]: 2025-12-06 10:24:57.116 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5150MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:24:57 compute-1 nova_compute[228576]: 2025-12-06 10:24:57.116 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:24:57 compute-1 nova_compute[228576]: 2025-12-06 10:24:57.116 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:24:57 compute-1 nova_compute[228576]: 2025-12-06 10:24:57.482 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:24:57 compute-1 nova_compute[228576]: 2025-12-06 10:24:57.482 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:24:57 compute-1 nova_compute[228576]: 2025-12-06 10:24:57.503 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:24:57 compute-1 sudo[248074]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:24:57 compute-1 sudo[248074]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:24:57 compute-1 sudo[248074]: pam_unix(sudo:session): session closed for user root
Dec 06 10:24:57 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:24:57 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/796350809' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:24:57 compute-1 nova_compute[228576]: 2025-12-06 10:24:57.952 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:24:57 compute-1 nova_compute[228576]: 2025-12-06 10:24:57.959 228580 DEBUG nova.compute.provider_tree [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed in ProviderTree for provider: ff2f17cb-ff1d-4da7-9560-4be741380cb1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:24:57 compute-1 nova_compute[228576]: 2025-12-06 10:24:57.979 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed for provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:24:57 compute-1 nova_compute[228576]: 2025-12-06 10:24:57.981 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:24:57 compute-1 nova_compute[228576]: 2025-12-06 10:24:57.981 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.864s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:24:58 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:24:58 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:24:58 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:24:58.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:24:58 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:24:58 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:24:58 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:24:58.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:24:58 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/796350809' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:24:58 compute-1 nova_compute[228576]: 2025-12-06 10:24:58.981 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:24:59 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:24:59 compute-1 ceph-mon[79770]: pgmap v1317: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:24:59 compute-1 nova_compute[228576]: 2025-12-06 10:24:59.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:24:59 compute-1 nova_compute[228576]: 2025-12-06 10:24:59.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:25:00 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:25:00 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:25:00 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:00.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:25:00 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:25:00 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:25:00 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:00.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:25:00 compute-1 nova_compute[228576]: 2025-12-06 10:25:00.464 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:25:01 compute-1 sudo[248122]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:25:01 compute-1 sudo[248122]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:25:01 compute-1 sudo[248122]: pam_unix(sudo:session): session closed for user root
Dec 06 10:25:01 compute-1 ceph-mon[79770]: pgmap v1318: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:25:01 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:25:01 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:25:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:01 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:25:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:01 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:25:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:01 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:25:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:02 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:25:02 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:25:02 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:25:02 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:02.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:25:02 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:25:02 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:25:02 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:02.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:25:02 compute-1 ceph-mon[79770]: pgmap v1319: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:25:03 compute-1 nova_compute[228576]: 2025-12-06 10:25:03.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:25:03 compute-1 nova_compute[228576]: 2025-12-06 10:25:03.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:25:03 compute-1 nova_compute[228576]: 2025-12-06 10:25:03.472 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:25:03 compute-1 nova_compute[228576]: 2025-12-06 10:25:03.472 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:25:03 compute-1 nova_compute[228576]: 2025-12-06 10:25:03.489 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 06 10:25:04 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:25:04 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:25:04 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:25:04 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:04.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:25:04 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:25:04 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:25:04 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:04.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:25:05 compute-1 ceph-mon[79770]: pgmap v1320: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:25:05 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/3124201791' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:25:06 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:25:06 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:25:06 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:06.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:25:06 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:25:06 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:25:06 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:06.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:25:06 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/2298643697' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:25:06 compute-1 ceph-mon[79770]: pgmap v1321: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:25:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:06 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:25:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:06 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:25:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:07 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:25:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:07 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:25:08 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:25:08 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:25:08 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:08.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:25:08 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:25:08 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:25:08 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:08.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:25:09 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:25:09 compute-1 ceph-mon[79770]: pgmap v1322: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:25:09 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:25:09 compute-1 podman[248151]: 2025-12-06 10:25:09.824521751 +0000 UTC m=+0.116230539 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec 06 10:25:10 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:25:10 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:25:10 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:10.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:25:10 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:25:10 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:25:10 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:10.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:25:10 compute-1 ceph-mon[79770]: pgmap v1323: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:25:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:11 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:25:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:12 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:25:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:12 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:25:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:12 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:25:12 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:25:12 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:25:12 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:12.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:25:12 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:25:12 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:25:12 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:12.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:25:13 compute-1 ceph-mon[79770]: pgmap v1324: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:25:14 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:25:14 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:25:14 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:25:14 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:14.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:25:14 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:25:14 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:25:14 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:14.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:25:14 compute-1 ceph-mon[79770]: pgmap v1325: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:25:14 compute-1 podman[248181]: 2025-12-06 10:25:14.764401892 +0000 UTC m=+0.066749970 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec 06 10:25:16 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:25:16 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:25:16 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:16.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:25:16 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:25:16 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:25:16 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:16.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:25:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:16 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:25:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:16 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:25:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:16 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:25:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:17 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:25:17 compute-1 ceph-mon[79770]: pgmap v1326: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:25:17 compute-1 sudo[248202]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:25:17 compute-1 sudo[248202]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:25:17 compute-1 sudo[248202]: pam_unix(sudo:session): session closed for user root
Dec 06 10:25:17 compute-1 podman[248206]: 2025-12-06 10:25:17.763264569 +0000 UTC m=+0.068448312 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 06 10:25:18 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:25:18 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:25:18 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:18.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:25:18 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:25:18 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:25:18 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:18.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:25:18 compute-1 ceph-mon[79770]: pgmap v1327: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:25:19 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:25:20 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:25:20 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:25:20 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:20.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:25:20 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:25:20 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:25:20 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:20.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:25:20 compute-1 ceph-mon[79770]: pgmap v1328: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:25:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:21 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:25:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:21 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:25:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:21 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:25:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:22 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:25:22 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:25:22 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:25:22 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:22.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:25:22 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:25:22 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:25:22 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:22.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:25:22 compute-1 ceph-mon[79770]: pgmap v1329: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:25:24 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:25:24 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:25:24 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:25:24 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:25:24 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:24.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:25:24 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:25:24 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:25:24 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:24.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:25:25 compute-1 ceph-mon[79770]: pgmap v1330: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 766 B/s rd, 0 op/s
Dec 06 10:25:26 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:25:26 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:25:26 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:26.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:25:26 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:25:26 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:25:26 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:26.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:25:26 compute-1 ceph-mon[79770]: pgmap v1331: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:25:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:26 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:25:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:26 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:25:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:26 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:25:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:27 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:25:28 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:25:28 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:25:28 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:28.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:25:28 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:25:28 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:25:28 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:28.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:25:28 compute-1 ceph-mon[79770]: pgmap v1332: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 766 B/s rd, 0 op/s
Dec 06 10:25:29 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:25:30 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:25:30 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:25:30 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:30.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:25:30 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:25:30 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:25:30 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:30.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:25:30 compute-1 ceph-mon[79770]: pgmap v1333: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:25:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:31 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:25:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:31 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:25:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:31 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:25:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:32 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:25:32 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:25:32 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:25:32 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:32.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:25:32 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:25:32 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:25:32 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:32.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:25:32 compute-1 ceph-mon[79770]: pgmap v1334: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:25:34 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:25:34 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:25:34 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:25:34 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:34.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:25:34 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:25:34 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:25:34 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:34.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:25:34 compute-1 ceph-mon[79770]: pgmap v1335: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:25:36 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:25:36 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:25:36 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:36.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:25:36 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:25:36 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:25:36 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:36.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:25:36 compute-1 ceph-mon[79770]: pgmap v1336: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:25:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:36 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:25:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:36 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:25:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:36 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:25:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:37 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:25:37 compute-1 sudo[248256]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:25:37 compute-1 sudo[248256]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:25:37 compute-1 sudo[248256]: pam_unix(sudo:session): session closed for user root
Dec 06 10:25:38 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:25:38 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:25:38 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:38.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:25:38 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:25:38 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:25:38 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:38.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:25:38 compute-1 ceph-mon[79770]: pgmap v1337: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:25:39 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:25:39 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:25:40 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:25:40 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:25:40 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:40.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:25:40 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:25:40 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:25:40 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:40.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:25:40 compute-1 ceph-mon[79770]: pgmap v1338: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:25:40 compute-1 podman[248283]: 2025-12-06 10:25:40.856893893 +0000 UTC m=+0.164209102 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3)
Dec 06 10:25:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:41 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:25:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:41 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:25:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:41 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:25:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:42 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:25:42 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:25:42 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:25:42 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:42.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:25:42 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:25:42 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:25:42 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:42.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:25:42 compute-1 ceph-mon[79770]: pgmap v1339: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:25:44 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:25:44 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:25:44 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:25:44 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:44.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:25:44 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:25:44 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:25:44 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:44.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:25:44 compute-1 ceph-mon[79770]: pgmap v1340: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:25:45 compute-1 podman[248310]: 2025-12-06 10:25:45.793881699 +0000 UTC m=+0.092894019 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 06 10:25:46 compute-1 ceph-mon[79770]: from='client.? 192.168.122.10:0/2814903871' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 06 10:25:46 compute-1 ceph-mon[79770]: from='client.? 192.168.122.10:0/2814903871' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 06 10:25:46 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:25:46 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:25:46 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:46.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:25:46 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:25:46 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:25:46 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:46.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:25:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:46 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:25:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:46 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:25:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:46 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:25:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:47 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:25:47 compute-1 ceph-mon[79770]: pgmap v1341: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:25:48 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:25:48 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:25:48 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:48.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:25:48 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:25:48 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:25:48 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:48.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:25:48 compute-1 ceph-mon[79770]: pgmap v1342: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:25:48 compute-1 podman[248331]: 2025-12-06 10:25:48.784001829 +0000 UTC m=+0.075512978 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 06 10:25:49 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:25:50 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:25:50 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:25:50 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:50.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:25:50 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:25:50 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:25:50 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:50.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:25:50 compute-1 ceph-mon[79770]: pgmap v1343: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:25:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:52 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:25:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:52 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:25:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:52 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:25:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:52 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:25:52 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:25:52 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:25:52 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:52.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:25:52 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:25:52 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:25:52 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:52.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:25:52 compute-1 ceph-mon[79770]: pgmap v1344: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:25:54 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:25:54 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:25:54 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:25:54 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:25:54 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:54.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:25:54 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:25:54 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:25:54 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:54.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:25:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:25:54.300 141446 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:25:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:25:54.301 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:25:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:25:54.301 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:25:54 compute-1 nova_compute[228576]: 2025-12-06 10:25:54.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:25:54 compute-1 nova_compute[228576]: 2025-12-06 10:25:54.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:25:54 compute-1 nova_compute[228576]: 2025-12-06 10:25:54.471 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:25:55 compute-1 ceph-mon[79770]: pgmap v1345: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:25:55 compute-1 nova_compute[228576]: 2025-12-06 10:25:55.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:25:56 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/3470316468' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:25:56 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:25:56 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:25:56 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:56.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:25:56 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:25:56 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:25:56 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:56.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:25:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:56 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:25:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:56 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:25:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:56 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:25:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:25:57 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:25:57 compute-1 ceph-mon[79770]: pgmap v1346: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:25:57 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/261979015' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:25:57 compute-1 nova_compute[228576]: 2025-12-06 10:25:57.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:25:57 compute-1 nova_compute[228576]: 2025-12-06 10:25:57.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:25:57 compute-1 nova_compute[228576]: 2025-12-06 10:25:57.471 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Dec 06 10:25:57 compute-1 nova_compute[228576]: 2025-12-06 10:25:57.490 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:25:57 compute-1 sudo[248355]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:25:57 compute-1 sudo[248355]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:25:57 compute-1 sudo[248355]: pam_unix(sudo:session): session closed for user root
Dec 06 10:25:58 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:25:58 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:25:58 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:25:58.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:25:58 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:25:58 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:25:58 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:25:58.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:25:58 compute-1 ceph-mon[79770]: pgmap v1347: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:25:58 compute-1 nova_compute[228576]: 2025-12-06 10:25:58.507 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:25:58 compute-1 nova_compute[228576]: 2025-12-06 10:25:58.534 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:25:58 compute-1 nova_compute[228576]: 2025-12-06 10:25:58.534 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:25:58 compute-1 nova_compute[228576]: 2025-12-06 10:25:58.534 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:25:58 compute-1 nova_compute[228576]: 2025-12-06 10:25:58.534 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:25:58 compute-1 nova_compute[228576]: 2025-12-06 10:25:58.535 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:25:58 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:25:58 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2411554739' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:25:59 compute-1 nova_compute[228576]: 2025-12-06 10:25:59.021 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.486s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:25:59 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:25:59 compute-1 nova_compute[228576]: 2025-12-06 10:25:59.179 228580 WARNING nova.virt.libvirt.driver [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:25:59 compute-1 nova_compute[228576]: 2025-12-06 10:25:59.181 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5181MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:25:59 compute-1 nova_compute[228576]: 2025-12-06 10:25:59.181 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:25:59 compute-1 nova_compute[228576]: 2025-12-06 10:25:59.181 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:25:59 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/2411554739' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:25:59 compute-1 nova_compute[228576]: 2025-12-06 10:25:59.356 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:25:59 compute-1 nova_compute[228576]: 2025-12-06 10:25:59.357 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:25:59 compute-1 nova_compute[228576]: 2025-12-06 10:25:59.444 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Refreshing inventories for resource provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec 06 10:25:59 compute-1 nova_compute[228576]: 2025-12-06 10:25:59.528 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Updating ProviderTree inventory for provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec 06 10:25:59 compute-1 nova_compute[228576]: 2025-12-06 10:25:59.529 228580 DEBUG nova.compute.provider_tree [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Updating inventory in ProviderTree for provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 06 10:25:59 compute-1 nova_compute[228576]: 2025-12-06 10:25:59.553 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Refreshing aggregate associations for resource provider ff2f17cb-ff1d-4da7-9560-4be741380cb1, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec 06 10:25:59 compute-1 nova_compute[228576]: 2025-12-06 10:25:59.583 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Refreshing trait associations for resource provider ff2f17cb-ff1d-4da7-9560-4be741380cb1, traits: COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_ACCELERATORS,HW_CPU_X86_SVM,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_AESNI,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSSE3,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE4A,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_FMA3,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_AVX,HW_CPU_X86_BMI2,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_SHA,HW_CPU_X86_SSE2,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_F16C,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_ABM,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NODE,COMPUTE_SECURITY_TPM_2_0,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_TPM_1_2,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_CLMUL,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AVX2,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_BMI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_SOCKET_PCI_NUMA_AFFINITY _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec 06 10:25:59 compute-1 nova_compute[228576]: 2025-12-06 10:25:59.605 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:26:00 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:26:00 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3387854361' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:26:00 compute-1 nova_compute[228576]: 2025-12-06 10:26:00.061 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:26:00 compute-1 nova_compute[228576]: 2025-12-06 10:26:00.069 228580 DEBUG nova.compute.provider_tree [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed in ProviderTree for provider: ff2f17cb-ff1d-4da7-9560-4be741380cb1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:26:00 compute-1 nova_compute[228576]: 2025-12-06 10:26:00.102 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed for provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:26:00 compute-1 nova_compute[228576]: 2025-12-06 10:26:00.104 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:26:00 compute-1 nova_compute[228576]: 2025-12-06 10:26:00.105 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.923s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:26:00 compute-1 nova_compute[228576]: 2025-12-06 10:26:00.105 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:26:00 compute-1 nova_compute[228576]: 2025-12-06 10:26:00.106 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Dec 06 10:26:00 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:26:00 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:26:00 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:26:00 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:00.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:26:00 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:26:00 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:00.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:26:00 compute-1 nova_compute[228576]: 2025-12-06 10:26:00.257 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Dec 06 10:26:00 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/3387854361' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:26:00 compute-1 ceph-mon[79770]: pgmap v1348: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:26:01 compute-1 sudo[248426]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:26:01 compute-1 sudo[248426]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:26:01 compute-1 sudo[248426]: pam_unix(sudo:session): session closed for user root
Dec 06 10:26:01 compute-1 sudo[248451]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Dec 06 10:26:01 compute-1 sudo[248451]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:26:01 compute-1 podman[248550]: 2025-12-06 10:26:01.881593518 +0000 UTC m=+0.059102930 container exec 500f8c89b5c281d0ace139835b07ea5bc6259ce3e028d8284d57da97424b55e0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-crash-compute-1, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Dec 06 10:26:01 compute-1 podman[248550]: 2025-12-06 10:26:01.98869579 +0000 UTC m=+0.166205162 container exec_died 500f8c89b5c281d0ace139835b07ea5bc6259ce3e028d8284d57da97424b55e0 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-crash-compute-1, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325)
Dec 06 10:26:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:01 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:26:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:02 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:26:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:02 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:26:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:02 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:26:02 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:26:02 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000026s ======
Dec 06 10:26:02 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:02.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Dec 06 10:26:02 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:26:02 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:26:02 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:02.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:26:02 compute-1 nova_compute[228576]: 2025-12-06 10:26:02.220 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:26:02 compute-1 nova_compute[228576]: 2025-12-06 10:26:02.220 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:26:02 compute-1 ceph-mon[79770]: pgmap v1349: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:26:02 compute-1 podman[248665]: 2025-12-06 10:26:02.40951755 +0000 UTC m=+0.046858166 container exec 6af22af7046e22bedbb2fb280e4d2c530c5b3cac3959f396bf7fe3d14752a7eb (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 06 10:26:02 compute-1 podman[248665]: 2025-12-06 10:26:02.41757446 +0000 UTC m=+0.054915096 container exec_died 6af22af7046e22bedbb2fb280e4d2c530c5b3cac3959f396bf7fe3d14752a7eb (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 06 10:26:02 compute-1 podman[248759]: 2025-12-06 10:26:02.708660175 +0000 UTC m=+0.051594034 container exec 044fb2629765feb8ffd5fd258951cd4533635db83b13cd8de7feeb48e81aeb97 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Dec 06 10:26:02 compute-1 podman[248759]: 2025-12-06 10:26:02.721465473 +0000 UTC m=+0.064399302 container exec_died 044fb2629765feb8ffd5fd258951cd4533635db83b13cd8de7feeb48e81aeb97 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Dec 06 10:26:02 compute-1 podman[248823]: 2025-12-06 10:26:02.911756833 +0000 UTC m=+0.048593429 container exec 70891cd2190622057f9c45299e27938f7b2105f0244eda3658dedfb18fed50f0 (image=quay.io/ceph/haproxy:2.3, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd)
Dec 06 10:26:02 compute-1 podman[248823]: 2025-12-06 10:26:02.92450741 +0000 UTC m=+0.061344006 container exec_died 70891cd2190622057f9c45299e27938f7b2105f0244eda3658dedfb18fed50f0 (image=quay.io/ceph/haproxy:2.3, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-haproxy-nfs-cephfs-compute-1-jmdafd)
Dec 06 10:26:03 compute-1 podman[248890]: 2025-12-06 10:26:03.113020715 +0000 UTC m=+0.046926287 container exec c8ec7212805c01399bc295ce2c5e69b11fbde393e887859b5ab336e81cd6d1f1 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-1-uzbtlt, io.buildah.version=1.28.2, distribution-scope=public, architecture=x86_64, name=keepalived, build-date=2023-02-22T09:23:20, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, version=2.2.4, description=keepalived for Ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides keepalived on RHEL 9 for Ceph., release=1793, vcs-type=git, com.redhat.component=keepalived-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Keepalived on RHEL 9, io.openshift.tags=Ceph keepalived, vendor=Red Hat, Inc.)
Dec 06 10:26:03 compute-1 podman[248890]: 2025-12-06 10:26:03.127502105 +0000 UTC m=+0.061407677 container exec_died c8ec7212805c01399bc295ce2c5e69b11fbde393e887859b5ab336e81cd6d1f1 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-keepalived-nfs-cephfs-compute-1-uzbtlt, com.redhat.component=keepalived-container, io.buildah.version=1.28.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vcs-type=git, vendor=Red Hat, Inc., build-date=2023-02-22T09:23:20, distribution-scope=public, architecture=x86_64, io.openshift.expose-services=, version=2.2.4, name=keepalived, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1793, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.tags=Ceph keepalived, description=keepalived for Ceph, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides keepalived on RHEL 9 for Ceph., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793)
Dec 06 10:26:03 compute-1 sudo[248451]: pam_unix(sudo:session): session closed for user root
Dec 06 10:26:03 compute-1 sudo[248920]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:26:03 compute-1 sudo[248920]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:26:03 compute-1 sudo[248920]: pam_unix(sudo:session): session closed for user root
Dec 06 10:26:03 compute-1 sudo[248945]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 06 10:26:03 compute-1 sudo[248945]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:26:03 compute-1 nova_compute[228576]: 2025-12-06 10:26:03.464 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:26:03 compute-1 sudo[248945]: pam_unix(sudo:session): session closed for user root
Dec 06 10:26:04 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:26:04 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:26:04 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:26:04 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:26:04 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:04.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:26:04 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:26:04 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:04.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:26:04 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:26:04 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:26:04 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:26:04 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:26:04 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Dec 06 10:26:05 compute-1 ceph-mon[79770]: pgmap v1350: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:26:05 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Dec 06 10:26:05 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 10:26:05 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 06 10:26:05 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:26:05 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:26:05 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 06 10:26:05 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 06 10:26:05 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 10:26:05 compute-1 ceph-mon[79770]: Health check failed: 1 failed cephadm daemon(s) (CEPHADM_FAILED_DAEMON)
Dec 06 10:26:05 compute-1 nova_compute[228576]: 2025-12-06 10:26:05.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:26:05 compute-1 nova_compute[228576]: 2025-12-06 10:26:05.471 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:26:05 compute-1 nova_compute[228576]: 2025-12-06 10:26:05.471 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:26:05 compute-1 nova_compute[228576]: 2025-12-06 10:26:05.517 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 06 10:26:06 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:26:06 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:26:06 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:06.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:26:06 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:26:06 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:26:06 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:06.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:26:06 compute-1 ceph-mon[79770]: pgmap v1351: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 602 B/s rd, 0 op/s
Dec 06 10:26:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:06 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:26:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:06 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:26:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:06 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:26:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:07 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:26:07 compute-1 ceph-mon[79770]: pgmap v1352: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 602 B/s rd, 0 op/s
Dec 06 10:26:07 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/3490516856' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:26:08 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:26:08 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:26:08 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:08.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:26:08 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:26:08 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:26:08 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:08.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:26:08 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/239905449' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:26:09 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:26:09 compute-1 sudo[249004]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:26:09 compute-1 sudo[249004]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:26:09 compute-1 sudo[249004]: pam_unix(sudo:session): session closed for user root
Dec 06 10:26:10 compute-1 ceph-mon[79770]: pgmap v1353: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 602 B/s rd, 0 op/s
Dec 06 10:26:10 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:26:10 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:26:10 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:26:10 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:26:10 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:26:10 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:26:10 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:26:10 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:10.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:26:10 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:26:10 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:10.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:26:11 compute-1 podman[249030]: 2025-12-06 10:26:11.830686432 +0000 UTC m=+0.133222423 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec 06 10:26:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:11 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:26:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:11 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:26:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:11 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:26:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:12 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:26:12 compute-1 ceph-mon[79770]: pgmap v1354: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 602 B/s rd, 0 op/s
Dec 06 10:26:12 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:26:12 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:26:12 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:26:12 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:26:12 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:12.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:26:12 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:12.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:26:14 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:26:14 compute-1 ceph-mon[79770]: pgmap v1355: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 602 B/s rd, 0 op/s
Dec 06 10:26:14 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:26:14 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:26:14 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:26:14 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:26:14 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:14.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:26:14 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:14.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:26:16 compute-1 ceph-mon[79770]: pgmap v1356: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 602 B/s rd, 0 op/s
Dec 06 10:26:16 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:26:16 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:26:16 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:26:16 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:16.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:26:16 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:26:16 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:16.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:26:16 compute-1 podman[249059]: 2025-12-06 10:26:16.76044385 +0000 UTC m=+0.065659993 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Dec 06 10:26:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:16 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:26:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:16 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:26:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:16 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:26:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:17 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:26:17 compute-1 sudo[249080]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:26:17 compute-1 sudo[249080]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:26:17 compute-1 sudo[249080]: pam_unix(sudo:session): session closed for user root
Dec 06 10:26:18 compute-1 ceph-mon[79770]: pgmap v1357: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:26:18 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:26:18 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:26:18 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:18.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:26:18 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:26:18 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:26:18 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:18.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:26:19 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:26:19 compute-1 podman[249106]: 2025-12-06 10:26:19.75104411 +0000 UTC m=+0.058086675 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 06 10:26:20 compute-1 ceph-mon[79770]: pgmap v1358: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:26:20 compute-1 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #76. Immutable memtables: 0.
Dec 06 10:26:20 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:26:20.129506) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 06 10:26:20 compute-1 ceph-mon[79770]: rocksdb: [db/flush_job.cc:856] [default] [JOB 45] Flushing memtable with next log file: 76
Dec 06 10:26:20 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016780129659, "job": 45, "event": "flush_started", "num_memtables": 1, "num_entries": 1286, "num_deletes": 255, "total_data_size": 2971840, "memory_usage": 3005568, "flush_reason": "Manual Compaction"}
Dec 06 10:26:20 compute-1 ceph-mon[79770]: rocksdb: [db/flush_job.cc:885] [default] [JOB 45] Level-0 flush table #77: started
Dec 06 10:26:20 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016780142596, "cf_name": "default", "job": 45, "event": "table_file_creation", "file_number": 77, "file_size": 1942990, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 39354, "largest_seqno": 40635, "table_properties": {"data_size": 1937422, "index_size": 2900, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1605, "raw_key_size": 12264, "raw_average_key_size": 19, "raw_value_size": 1926054, "raw_average_value_size": 3116, "num_data_blocks": 125, "num_entries": 618, "num_filter_entries": 618, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765016684, "oldest_key_time": 1765016684, "file_creation_time": 1765016780, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 77, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:26:20 compute-1 ceph-mon[79770]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 45] Flush lasted 13163 microseconds, and 6658 cpu microseconds.
Dec 06 10:26:20 compute-1 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:26:20 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:26:20.142688) [db/flush_job.cc:967] [default] [JOB 45] Level-0 flush table #77: 1942990 bytes OK
Dec 06 10:26:20 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:26:20.142708) [db/memtable_list.cc:519] [default] Level-0 commit table #77 started
Dec 06 10:26:20 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:26:20.144111) [db/memtable_list.cc:722] [default] Level-0 commit table #77: memtable #1 done
Dec 06 10:26:20 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:26:20.144124) EVENT_LOG_v1 {"time_micros": 1765016780144120, "job": 45, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 06 10:26:20 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:26:20.144161) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 06 10:26:20 compute-1 ceph-mon[79770]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 45] Try to delete WAL files size 2965655, prev total WAL file size 2965655, number of live WAL files 2.
Dec 06 10:26:20 compute-1 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000073.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:26:20 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:26:20.145201) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0031303035' seq:72057594037927935, type:22 .. '6C6F676D0031323536' seq:0, type:0; will stop at (end)
Dec 06 10:26:20 compute-1 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 46] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 06 10:26:20 compute-1 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 45 Base level 0, inputs: [77(1897KB)], [75(12MB)]
Dec 06 10:26:20 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016780145342, "job": 46, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [77], "files_L6": [75], "score": -1, "input_data_size": 15268690, "oldest_snapshot_seqno": -1}
Dec 06 10:26:20 compute-1 ceph-mon[79770]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 46] Generated table #78: 6854 keys, 15100115 bytes, temperature: kUnknown
Dec 06 10:26:20 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016780218418, "cf_name": "default", "job": 46, "event": "table_file_creation", "file_number": 78, "file_size": 15100115, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15055066, "index_size": 26825, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 17157, "raw_key_size": 180460, "raw_average_key_size": 26, "raw_value_size": 14932341, "raw_average_value_size": 2178, "num_data_blocks": 1056, "num_entries": 6854, "num_filter_entries": 6854, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765014012, "oldest_key_time": 0, "file_creation_time": 1765016780, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "79d3ff52-4bd2-4fdc-8b55-33dd9c53d8e0", "db_session_id": "1TK25AVRA1WQDS4JHM8T", "orig_file_number": 78, "seqno_to_time_mapping": "N/A"}}
Dec 06 10:26:20 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:26:20 compute-1 ceph-mon[79770]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 06 10:26:20 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:26:20 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:20.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:26:20 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:26:20.218726) [db/compaction/compaction_job.cc:1663] [default] [JOB 46] Compacted 1@0 + 1@6 files to L6 => 15100115 bytes
Dec 06 10:26:20 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:26:20.220231) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 208.7 rd, 206.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.9, 12.7 +0.0 blob) out(14.4 +0.0 blob), read-write-amplify(15.6) write-amplify(7.8) OK, records in: 7383, records dropped: 529 output_compression: NoCompression
Dec 06 10:26:20 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:26:20.220255) EVENT_LOG_v1 {"time_micros": 1765016780220244, "job": 46, "event": "compaction_finished", "compaction_time_micros": 73167, "compaction_time_cpu_micros": 33140, "output_level": 6, "num_output_files": 1, "total_output_size": 15100115, "num_input_records": 7383, "num_output_records": 6854, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 06 10:26:20 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:26:20 compute-1 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000077.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:26:20 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016780220958, "job": 46, "event": "table_file_deletion", "file_number": 77}
Dec 06 10:26:20 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:26:20 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:20.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:26:20 compute-1 ceph-mon[79770]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000075.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 06 10:26:20 compute-1 ceph-mon[79770]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765016780224572, "job": 46, "event": "table_file_deletion", "file_number": 75}
Dec 06 10:26:20 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:26:20.145116) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:26:20 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:26:20.224686) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:26:20 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:26:20.224694) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:26:20 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:26:20.224696) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:26:20 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:26:20.224698) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:26:20 compute-1 ceph-mon[79770]: rocksdb: (Original Log Time 2025/12/06-10:26:20.224701) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 06 10:26:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:21 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:26:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:21 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:26:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:21 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:26:22 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:26:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:22 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:26:22 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:26:22 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:26:22 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:22.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:26:22 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:26:22 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:22.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:26:22 compute-1 ceph-mon[79770]: pgmap v1359: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:26:23 compute-1 ceph-mon[79770]: pgmap v1360: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:26:24 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:26:24 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:26:24 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:26:24 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:24.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:26:24 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:26:24 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:26:24 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:24.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:26:24 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:26:25 compute-1 ceph-mon[79770]: pgmap v1361: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:26:26 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:26:26 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:26:26 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:26.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:26:26 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:26:26 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:26:26 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:26.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:26:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:26 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:26:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:26 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:26:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:26 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:26:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:27 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:26:27 compute-1 ceph-mon[79770]: pgmap v1362: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:26:28 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:26:28 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:26:28 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:28.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:26:28 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:26:28 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:26:28 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:28.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:26:29 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:26:29 compute-1 ceph-mon[79770]: pgmap v1363: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:26:30 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:26:30 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:26:30 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:30.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:26:30 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:26:30 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:26:30 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:30.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:26:31 compute-1 ceph-mon[79770]: pgmap v1364: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:26:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:32 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:26:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:32 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:26:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:32 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:26:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:32 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:26:32 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:26:32 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:26:32 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:32.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:26:32 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:26:32 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:26:32 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:32.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:26:33 compute-1 ceph-mon[79770]: pgmap v1365: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:26:34 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:26:34 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:26:34 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:26:34 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:34.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:26:34 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:26:34 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:26:34 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:34.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:26:35 compute-1 ceph-mon[79770]: pgmap v1366: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:26:36 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:26:36 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:26:36 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:36.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:26:36 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:26:36 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:26:36 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:36.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:26:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:36 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:26:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:36 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:26:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:36 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:26:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:37 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:26:37 compute-1 ceph-mon[79770]: pgmap v1367: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:26:38 compute-1 sudo[249135]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:26:38 compute-1 sudo[249135]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:26:38 compute-1 sudo[249135]: pam_unix(sudo:session): session closed for user root
Dec 06 10:26:38 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:26:38 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:26:38 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:38.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:26:38 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:26:38 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:26:38 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:38.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:26:39 compute-1 sshd-session[249161]: Connection closed by 117.50.226.213 port 45384
Dec 06 10:26:39 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:26:39 compute-1 ceph-mon[79770]: pgmap v1368: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:26:39 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:26:40 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:26:40 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:26:40 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:40.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:26:40 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:26:40 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:26:40 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:40.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:26:41 compute-1 ceph-mon[79770]: pgmap v1369: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:26:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:41 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:26:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:41 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:26:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:41 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:26:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:42 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:26:42 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:26:42 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:26:42 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:42.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:26:42 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:26:42 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:26:42 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:42.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:26:42 compute-1 podman[249164]: 2025-12-06 10:26:42.796414978 +0000 UTC m=+0.093712290 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 06 10:26:43 compute-1 ceph-mon[79770]: pgmap v1370: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:26:44 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:26:44 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:26:44 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:26:44 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:44.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:26:44 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:26:44 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:26:44 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:44.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:26:46 compute-1 ceph-mon[79770]: pgmap v1371: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:26:46 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:26:46 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:26:46 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:46.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:26:46 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:26:46 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:26:46 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:46.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:26:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:46 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:26:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:46 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:26:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:46 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:26:47 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:47 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:26:47 compute-1 ceph-mon[79770]: from='client.? 192.168.122.10:0/430995373' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Dec 06 10:26:47 compute-1 ceph-mon[79770]: from='client.? 192.168.122.10:0/430995373' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Dec 06 10:26:47 compute-1 ceph-mon[79770]: pgmap v1372: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:26:47 compute-1 podman[249192]: 2025-12-06 10:26:47.794597177 +0000 UTC m=+0.095510855 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 06 10:26:48 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:26:48 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:26:48 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:48.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:26:48 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:26:48 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:26:48 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:48.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:26:49 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:26:49 compute-1 ceph-mon[79770]: pgmap v1373: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:26:50 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:26:50 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:26:50 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:50.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:26:50 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:26:50 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:26:50 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:50.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:26:50 compute-1 podman[249213]: 2025-12-06 10:26:50.767130238 +0000 UTC m=+0.071490098 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec 06 10:26:51 compute-1 ceph-mon[79770]: pgmap v1374: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:26:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:51 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:26:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:51 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:26:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:51 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:26:52 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:52 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:26:52 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:26:52 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:26:52 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:52.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:26:52 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:26:52 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:26:52 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:52.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:26:53 compute-1 ceph-mon[79770]: pgmap v1375: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:26:54 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:26:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:26:54.302 141446 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:26:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:26:54.303 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:26:54 compute-1 ovn_metadata_agent[141441]: 2025-12-06 10:26:54.303 141446 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:26:54 compute-1 nova_compute[228576]: 2025-12-06 10:26:54.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:26:54 compute-1 nova_compute[228576]: 2025-12-06 10:26:54.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:26:54 compute-1 nova_compute[228576]: 2025-12-06 10:26:54.472 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 06 10:26:54 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:26:54 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:26:54 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:26:54 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:54.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:26:54 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:26:54 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:26:54 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:54.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:26:55 compute-1 ceph-mon[79770]: pgmap v1376: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:26:55 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/3570750853' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:26:56 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/3927824637' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:26:56 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:26:56 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:26:56 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:56.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:26:56 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:26:56 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:26:56 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:56.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:26:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:56 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:26:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:56 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:26:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:56 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:26:57 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:26:57 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:26:57 compute-1 nova_compute[228576]: 2025-12-06 10:26:57.472 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:26:57 compute-1 nova_compute[228576]: 2025-12-06 10:26:57.472 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:26:57 compute-1 ceph-mon[79770]: pgmap v1377: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:26:58 compute-1 sudo[249238]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:26:58 compute-1 sudo[249238]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:26:58 compute-1 sudo[249238]: pam_unix(sudo:session): session closed for user root
Dec 06 10:26:58 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:26:58 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:26:58 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:26:58.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:26:58 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:26:58 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:26:58 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:26:58.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:26:59 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:26:59 compute-1 nova_compute[228576]: 2025-12-06 10:26:59.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:26:59 compute-1 nova_compute[228576]: 2025-12-06 10:26:59.494 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:26:59 compute-1 nova_compute[228576]: 2025-12-06 10:26:59.495 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:26:59 compute-1 nova_compute[228576]: 2025-12-06 10:26:59.495 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:26:59 compute-1 nova_compute[228576]: 2025-12-06 10:26:59.495 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 06 10:26:59 compute-1 nova_compute[228576]: 2025-12-06 10:26:59.495 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:26:59 compute-1 ceph-mon[79770]: pgmap v1378: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:26:59 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:26:59 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2022403802' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:26:59 compute-1 nova_compute[228576]: 2025-12-06 10:26:59.951 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:27:00 compute-1 nova_compute[228576]: 2025-12-06 10:27:00.102 228580 WARNING nova.virt.libvirt.driver [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 06 10:27:00 compute-1 nova_compute[228576]: 2025-12-06 10:27:00.104 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5169MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 06 10:27:00 compute-1 nova_compute[228576]: 2025-12-06 10:27:00.104 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 06 10:27:00 compute-1 nova_compute[228576]: 2025-12-06 10:27:00.104 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 06 10:27:00 compute-1 nova_compute[228576]: 2025-12-06 10:27:00.179 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 06 10:27:00 compute-1 nova_compute[228576]: 2025-12-06 10:27:00.179 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 06 10:27:00 compute-1 nova_compute[228576]: 2025-12-06 10:27:00.221 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 06 10:27:00 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/2022403802' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:27:00 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 06 10:27:00 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3629784107' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:27:00 compute-1 nova_compute[228576]: 2025-12-06 10:27:00.644 228580 DEBUG oslo_concurrency.processutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.424s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 06 10:27:00 compute-1 nova_compute[228576]: 2025-12-06 10:27:00.651 228580 DEBUG nova.compute.provider_tree [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed in ProviderTree for provider: ff2f17cb-ff1d-4da7-9560-4be741380cb1 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 06 10:27:00 compute-1 nova_compute[228576]: 2025-12-06 10:27:00.675 228580 DEBUG nova.scheduler.client.report [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Inventory has not changed for provider ff2f17cb-ff1d-4da7-9560-4be741380cb1 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 06 10:27:00 compute-1 nova_compute[228576]: 2025-12-06 10:27:00.676 228580 DEBUG nova.compute.resource_tracker [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 06 10:27:00 compute-1 nova_compute[228576]: 2025-12-06 10:27:00.676 228580 DEBUG oslo_concurrency.lockutils [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.572s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 06 10:27:00 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:27:00 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:27:00 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:27:00.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:27:00 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:27:00 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:27:00 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:27:00.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:27:01 compute-1 ceph-mon[79770]: pgmap v1379: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:27:01 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/3629784107' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:27:01 compute-1 nova_compute[228576]: 2025-12-06 10:27:01.670 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:27:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:27:01 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:27:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:27:01 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:27:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:27:01 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:27:02 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:27:02 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:27:02 compute-1 nova_compute[228576]: 2025-12-06 10:27:02.470 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:27:02 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:27:02 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:27:02 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:27:02.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:27:02 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:27:02 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:27:02 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:27:02.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:27:03 compute-1 nova_compute[228576]: 2025-12-06 10:27:03.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:27:03 compute-1 ceph-mon[79770]: pgmap v1380: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:27:04 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:27:04 compute-1 nova_compute[228576]: 2025-12-06 10:27:04.464 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:27:04 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:27:04 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:27:04 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:27:04.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:27:04 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:27:04 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:27:04 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:27:04.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:27:05 compute-1 ceph-mon[79770]: pgmap v1381: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:27:06 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:27:06 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:27:06 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:27:06.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:27:06 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:27:06 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:27:06 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:27:06.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:27:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:27:06 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:27:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:27:06 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:27:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:27:06 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:27:07 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:27:07 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:27:07 compute-1 nova_compute[228576]: 2025-12-06 10:27:07.471 228580 DEBUG oslo_service.periodic_task [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 06 10:27:07 compute-1 nova_compute[228576]: 2025-12-06 10:27:07.471 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 06 10:27:07 compute-1 nova_compute[228576]: 2025-12-06 10:27:07.471 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 06 10:27:07 compute-1 nova_compute[228576]: 2025-12-06 10:27:07.494 228580 DEBUG nova.compute.manager [None req-3bbd3549-369b-4c8f-b3d7-c679686facd3 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 06 10:27:07 compute-1 ceph-mon[79770]: pgmap v1382: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:27:08 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:27:08 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:27:08 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:27:08.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:27:08 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:27:08 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:27:08 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:27:08.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:27:09 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:27:09 compute-1 ceph-mon[79770]: pgmap v1383: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:27:09 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:27:09 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/1908876995' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:27:09 compute-1 sudo[249313]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Dec 06 10:27:09 compute-1 sudo[249313]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:27:09 compute-1 sudo[249313]: pam_unix(sudo:session): session closed for user root
Dec 06 10:27:09 compute-1 sudo[249338]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/5ecd3f74-dade-5fc4-92ce-8950ae424258/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Dec 06 10:27:09 compute-1 sudo[249338]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:27:10 compute-1 sudo[249338]: pam_unix(sudo:session): session closed for user root
Dec 06 10:27:10 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:27:10 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000024s ======
Dec 06 10:27:10 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:27:10.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000024s
Dec 06 10:27:10 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:27:10 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:27:10 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:27:10.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:27:10 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/3374521652' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Dec 06 10:27:11 compute-1 ceph-mon[79770]: pgmap v1384: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:27:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:27:11 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:27:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:27:12 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:27:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:27:12 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:27:12 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:27:12 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:27:12 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:27:12 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:27:12 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:27:12.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:27:12 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:27:12 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:27:12 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:27:12.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:27:13 compute-1 podman[249396]: 2025-12-06 10:27:13.841466657 +0000 UTC m=+0.143373755 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec 06 10:27:13 compute-1 ceph-mon[79770]: pgmap v1385: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:27:14 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:27:14 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:27:14 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:27:14 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:27:14.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:27:14 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:27:14 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:27:14 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:27:14.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:27:14 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:27:14 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:27:14 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 10:27:14 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Dec 06 10:27:14 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:27:14 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:27:14 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Dec 06 10:27:14 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Dec 06 10:27:14 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 10:27:15 compute-1 ceph-mon[79770]: pgmap v1386: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:27:15 compute-1 ceph-mon[79770]: pgmap v1387: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 590 B/s rd, 0 op/s
Dec 06 10:27:15 compute-1 ceph-mon[79770]: Health check update: 2 failed cephadm daemon(s) (CEPHADM_FAILED_DAEMON)
Dec 06 10:27:15 compute-1 sshd-session[249425]: Accepted publickey for zuul from 192.168.122.10 port 39518 ssh2: ECDSA SHA256:r1j7aLsKAM+XxDNbzEU5vWGpGNCOaIBwc7FZdATPttA
Dec 06 10:27:15 compute-1 systemd-logind[788]: New session 58 of user zuul.
Dec 06 10:27:16 compute-1 systemd[1]: Started Session 58 of User zuul.
Dec 06 10:27:16 compute-1 sshd-session[249425]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 06 10:27:16 compute-1 sudo[249429]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt'
Dec 06 10:27:16 compute-1 sudo[249429]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 06 10:27:16 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:27:16 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:27:16 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:27:16.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:27:16 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:27:16 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:27:16 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:27:16.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:27:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:27:16 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:27:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:27:16 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:27:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:27:16 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:27:17 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:27:17 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:27:17 compute-1 ceph-mon[79770]: pgmap v1388: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 590 B/s rd, 0 op/s
Dec 06 10:27:18 compute-1 sudo[249563]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:27:18 compute-1 sudo[249563]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:27:18 compute-1 sudo[249563]: pam_unix(sudo:session): session closed for user root
Dec 06 10:27:18 compute-1 podman[249604]: 2025-12-06 10:27:18.31430132 +0000 UTC m=+0.060803493 container health_status 4b9f96debb9ad7b4577ada81982115281f208293a2afb9ce73415d24523bc02b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 06 10:27:18 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:27:18 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:27:18 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:27:18.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:27:18 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:27:18 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:27:18 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:27:18.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:27:19 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:27:19 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "status"} v 0)
Dec 06 10:27:19 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/725242139' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Dec 06 10:27:19 compute-1 ceph-mon[79770]: from='client.26756 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:27:19 compute-1 ceph-mon[79770]: from='client.28039 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:27:19 compute-1 ceph-mon[79770]: from='client.18615 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:27:19 compute-1 ceph-mon[79770]: pgmap v1389: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 590 B/s rd, 0 op/s
Dec 06 10:27:19 compute-1 ceph-mon[79770]: from='client.26765 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:27:19 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/725242139' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Dec 06 10:27:19 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/753713519' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Dec 06 10:27:19 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/3613403269' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Dec 06 10:27:20 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:27:20 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:27:20 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:27:20.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:27:20 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:27:20 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:27:20 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:27:20.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:27:20 compute-1 sudo[249763]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Dec 06 10:27:20 compute-1 sudo[249763]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:27:20 compute-1 sudo[249763]: pam_unix(sudo:session): session closed for user root
Dec 06 10:27:20 compute-1 podman[249787]: 2025-12-06 10:27:20.946960084 +0000 UTC m=+0.067006337 container health_status ea065256e56fb7d0c24012a05a267b67f07ce4c3d4e7f787c6d247affbd6ae59 (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=multipathd, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec 06 10:27:21 compute-1 ceph-mon[79770]: from='client.28045 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:27:21 compute-1 ceph-mon[79770]: from='client.18624 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:27:21 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:27:21 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' 
Dec 06 10:27:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:27:21 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:27:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:27:21 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:27:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:27:21 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:27:22 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:27:22 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:27:22 compute-1 ceph-mon[79770]: pgmap v1390: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 590 B/s rd, 0 op/s
Dec 06 10:27:22 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:27:22 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:27:22 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:27:22.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:27:22 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:27:22 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:27:22 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:27:22.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:27:22 compute-1 ovs-vsctl[249841]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Dec 06 10:27:23 compute-1 virtqemud[228188]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Dec 06 10:27:23 compute-1 virtqemud[228188]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Dec 06 10:27:23 compute-1 virtqemud[228188]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Dec 06 10:27:24 compute-1 ceph-mon[79770]: pgmap v1391: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 590 B/s rd, 0 op/s
Dec 06 10:27:24 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:27:24 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:27:24 compute-1 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb asok_command: cache status {prefix=cache status} (starting...)
Dec 06 10:27:24 compute-1 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb Can't run that command on an inactive MDS!
Dec 06 10:27:24 compute-1 lvm[250155]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 06 10:27:24 compute-1 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb asok_command: client ls {prefix=client ls} (starting...)
Dec 06 10:27:24 compute-1 lvm[250155]: VG ceph_vg0 finished
Dec 06 10:27:24 compute-1 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb Can't run that command on an inactive MDS!
Dec 06 10:27:24 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:27:24 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:27:24 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:27:24.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:27:24 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:27:24 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:27:24 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:27:24.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:27:25 compute-1 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb asok_command: damage ls {prefix=damage ls} (starting...)
Dec 06 10:27:25 compute-1 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb Can't run that command on an inactive MDS!
Dec 06 10:27:25 compute-1 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb asok_command: dump loads {prefix=dump loads} (starting...)
Dec 06 10:27:25 compute-1 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb Can't run that command on an inactive MDS!
Dec 06 10:27:25 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "report"} v 0)
Dec 06 10:27:25 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1297629055' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Dec 06 10:27:25 compute-1 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb asok_command: dump tree {prefix=dump tree,root=/} (starting...)
Dec 06 10:27:25 compute-1 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb Can't run that command on an inactive MDS!
Dec 06 10:27:25 compute-1 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb asok_command: dump_blocked_ops {prefix=dump_blocked_ops} (starting...)
Dec 06 10:27:25 compute-1 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb Can't run that command on an inactive MDS!
Dec 06 10:27:25 compute-1 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb asok_command: dump_historic_ops {prefix=dump_historic_ops} (starting...)
Dec 06 10:27:25 compute-1 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb Can't run that command on an inactive MDS!
Dec 06 10:27:25 compute-1 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb asok_command: dump_historic_ops_by_duration {prefix=dump_historic_ops_by_duration} (starting...)
Dec 06 10:27:25 compute-1 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb Can't run that command on an inactive MDS!
Dec 06 10:27:26 compute-1 ceph-mon[79770]: from='client.28060 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:27:26 compute-1 ceph-mon[79770]: pgmap v1392: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 590 B/s rd, 0 op/s
Dec 06 10:27:26 compute-1 ceph-mon[79770]: from='client.? ' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Dec 06 10:27:26 compute-1 ceph-mon[79770]: from='client.26783 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:27:26 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/2556019190' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Dec 06 10:27:26 compute-1 ceph-mon[79770]: from='client.28072 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:27:26 compute-1 ceph-mon[79770]: from='client.28078 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:27:26 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/2493359472' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Dec 06 10:27:26 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/1297629055' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Dec 06 10:27:26 compute-1 ceph-mon[79770]: from='client.? ' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Dec 06 10:27:26 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/2975038871' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 10:27:26 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/1789204808' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 10:27:26 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/2998337513' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Dec 06 10:27:26 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/1843947048' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Dec 06 10:27:26 compute-1 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb asok_command: dump_ops_in_flight {prefix=dump_ops_in_flight} (starting...)
Dec 06 10:27:26 compute-1 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb Can't run that command on an inactive MDS!
Dec 06 10:27:26 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config log"} v 0)
Dec 06 10:27:26 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/878022195' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Dec 06 10:27:26 compute-1 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb asok_command: get subtrees {prefix=get subtrees} (starting...)
Dec 06 10:27:26 compute-1 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb Can't run that command on an inactive MDS!
Dec 06 10:27:26 compute-1 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb asok_command: ops {prefix=ops} (starting...)
Dec 06 10:27:26 compute-1 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb Can't run that command on an inactive MDS!
Dec 06 10:27:26 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config-key dump"} v 0)
Dec 06 10:27:26 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2966237147' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Dec 06 10:27:26 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm"} v 0)
Dec 06 10:27:26 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/118631599' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Dec 06 10:27:26 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:27:26 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:27:26 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:27:26.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:27:26 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:27:26 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:27:26 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:27:26.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:27:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:27:26 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:27:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:27:27 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:27:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:27:27 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:27:27 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:27:27 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:27:27 compute-1 ceph-mon[79770]: from='client.26801 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:27:27 compute-1 ceph-mon[79770]: from='client.18669 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:27:27 compute-1 ceph-mon[79770]: from='client.28105 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:27:27 compute-1 ceph-mon[79770]: from='client.26813 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:27:27 compute-1 ceph-mon[79770]: from='client.18687 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:27:27 compute-1 ceph-mon[79770]: from='client.28132 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:27:27 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/3645085107' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Dec 06 10:27:27 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/878022195' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Dec 06 10:27:27 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/3661082603' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Dec 06 10:27:27 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/2198453763' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Dec 06 10:27:27 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/4011143574' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Dec 06 10:27:27 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/2966237147' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Dec 06 10:27:27 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/118631599' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Dec 06 10:27:27 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/3308889397' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Dec 06 10:27:27 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/3347068796' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Dec 06 10:27:27 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0)
Dec 06 10:27:27 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/780400855' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Dec 06 10:27:27 compute-1 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb asok_command: session ls {prefix=session ls} (starting...)
Dec 06 10:27:27 compute-1 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb Can't run that command on an inactive MDS!
Dec 06 10:27:27 compute-1 ceph-mds[84241]: mds.cephfs.compute-1.fpvjgb asok_command: status {prefix=status} (starting...)
Dec 06 10:27:27 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Dec 06 10:27:27 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2396899628' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Dec 06 10:27:28 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "features"} v 0)
Dec 06 10:27:28 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2579997392' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Dec 06 10:27:28 compute-1 ceph-mon[79770]: from='client.26822 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:27:28 compute-1 ceph-mon[79770]: from='client.18699 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:27:28 compute-1 ceph-mon[79770]: from='client.28162 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:27:28 compute-1 ceph-mon[79770]: pgmap v1393: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:27:28 compute-1 ceph-mon[79770]: from='client.28174 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:27:28 compute-1 ceph-mon[79770]: from='client.28180 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:27:28 compute-1 ceph-mon[79770]: from='client.18738 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:27:28 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/780400855' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Dec 06 10:27:28 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/1082146717' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Dec 06 10:27:28 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/3948499551' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Dec 06 10:27:28 compute-1 ceph-mon[79770]: from='client.? ' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Dec 06 10:27:28 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/98991412' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Dec 06 10:27:28 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/2396899628' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Dec 06 10:27:28 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/3505130251' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Dec 06 10:27:28 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/2140146032' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Dec 06 10:27:28 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/837080414' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Dec 06 10:27:28 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/2585617435' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Dec 06 10:27:28 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/2579997392' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Dec 06 10:27:28 compute-1 ceph-mon[79770]: from='client.? ' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Dec 06 10:27:28 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Dec 06 10:27:28 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3061321017' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Dec 06 10:27:28 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail"} v 0)
Dec 06 10:27:28 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2768020805' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Dec 06 10:27:28 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0)
Dec 06 10:27:28 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/389558928' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Dec 06 10:27:28 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:27:28 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:27:28 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:27:28.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:27:28 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:27:28 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:27:28 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:27:28.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:27:29 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr stat"} v 0)
Dec 06 10:27:29 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4226561449' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Dec 06 10:27:29 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:27:29 compute-1 ceph-mon[79770]: from='client.18756 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:27:29 compute-1 ceph-mon[79770]: from='client.26870 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:27:29 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/2329383087' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Dec 06 10:27:29 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/3061321017' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Dec 06 10:27:29 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/2397306980' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Dec 06 10:27:29 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/3859338692' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Dec 06 10:27:29 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/2768020805' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Dec 06 10:27:29 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/389558928' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Dec 06 10:27:29 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/3683697799' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Dec 06 10:27:29 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/1751130465' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Dec 06 10:27:29 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/3275467326' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Dec 06 10:27:29 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/4226561449' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Dec 06 10:27:29 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/980335214' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Dec 06 10:27:29 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/472944739' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Dec 06 10:27:29 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} v 0)
Dec 06 10:27:29 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4135499701' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Dec 06 10:27:29 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0)
Dec 06 10:27:29 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1023966935' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Dec 06 10:27:29 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} v 0)
Dec 06 10:27:29 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2104238276' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Dec 06 10:27:30 compute-1 ceph-mon[79770]: from='client.28252 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:27:30 compute-1 ceph-mon[79770]: from='client.18834 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:27:30 compute-1 ceph-mon[79770]: pgmap v1394: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:27:30 compute-1 ceph-mon[79770]: from='client.26921 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:27:30 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/933451133' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Dec 06 10:27:30 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/409247307' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Dec 06 10:27:30 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/4135499701' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Dec 06 10:27:30 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/1023966935' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Dec 06 10:27:30 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/2837912887' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Dec 06 10:27:30 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/2687712743' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Dec 06 10:27:30 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/2133887337' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Dec 06 10:27:30 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/2104238276' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Dec 06 10:27:30 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/4240077914' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Dec 06 10:27:30 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0)
Dec 06 10:27:30 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3088289533' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Dec 06 10:27:30 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:27:30 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:27:30 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:27:30.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:27:30 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:27:30 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:27:30 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:27:30.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:27:30 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Dec 06 10:27:30 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/907444341' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:21.292395+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:22.292588+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:23.292801+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:24.292967+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 928389 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:25.293129+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:26.293290+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:27.293531+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:28.293701+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:29.293820+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:30.294001+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 928389 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:30 compute-1 ceph-osd[77465]: osd.0 146 ms_handle_reset con 0x55fb2417d400 session 0x55fb252a5680
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:31.294354+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:32.294555+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:33.294835+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:34.295044+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:35.295277+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 928389 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:36.295468+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:37.295692+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:38.295878+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:39.296066+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:40.296255+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 928389 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:41.296411+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb25ef5400
Dec 06 10:27:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 27.333505630s of 27.336708069s, submitted: 1
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83116032 unmapped: 1695744 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:42.296539+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83124224 unmapped: 1687552 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:43.296675+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83124224 unmapped: 1687552 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:44.296816+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [0,2])
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83148800 unmapped: 1662976 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:45.296952+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930049 data_alloc: 218103808 data_used: 135168
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83148800 unmapped: 1662976 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:46.297113+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83148800 unmapped: 1662976 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:47.297273+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83165184 unmapped: 1646592 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:48.297438+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83165184 unmapped: 1646592 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:49.297627+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83165184 unmapped: 1646592 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:50.297860+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930049 data_alloc: 218103808 data_used: 135168
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83165184 unmapped: 1646592 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:51.298038+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83165184 unmapped: 1646592 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:52.298245+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83165184 unmapped: 1646592 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:53.298366+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83165184 unmapped: 1646592 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:54.298551+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.960668564s of 13.129460335s, submitted: 12
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83173376 unmapped: 1638400 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:55.298722+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929310 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83173376 unmapped: 1638400 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:56.298896+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83173376 unmapped: 1638400 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:57.299065+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83173376 unmapped: 1638400 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:58.299273+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83173376 unmapped: 1638400 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:54:59.299493+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83173376 unmapped: 1638400 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:00.299719+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929310 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83173376 unmapped: 1638400 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:01.299882+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83173376 unmapped: 1638400 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:02.300055+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:03.300279+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:04.300433+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:05.300620+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929310 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:06.300766+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:07.300924+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:08.301111+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:09.301236+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:10.301453+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929310 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:11.301581+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:12.301728+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:13.301966+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:14.302117+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:15.302311+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929310 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:16.302468+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:17.302603+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:18.302768+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:19.302983+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:20.303218+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929310 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:21.303367+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:22.303519+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:23.303656+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:24.303799+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:25.303958+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929310 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:26.304087+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:27.304239+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:28.304383+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:29.304535+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:30.304744+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929310 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:31.304916+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: osd.0 146 ms_handle_reset con 0x55fb2417cc00 session 0x55fb271bda40
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:32.305085+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:33.305269+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:34.305446+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:35.305620+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929310 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:36.305846+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:37.306034+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:38.306205+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:39.306395+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:40.306846+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929310 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:41.307062+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:42.307259+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83181568 unmapped: 1630208 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb24bd2400
Dec 06 10:27:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 48.355422974s of 48.445949554s, submitted: 1
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:43.307647+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83222528 unmapped: 1589248 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:44.307846+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83222528 unmapped: 1589248 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:45.308028+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83222528 unmapped: 1589248 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 929442 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:30 compute-1 ceph-osd[77465]: osd.0 146 ms_handle_reset con 0x55fb25ef5400 session 0x55fb26d65860
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:46.308263+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83222528 unmapped: 1589248 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:47.308432+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83222528 unmapped: 1589248 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:48.308615+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83222528 unmapped: 1589248 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:49.308905+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83222528 unmapped: 1589248 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:50.309097+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83222528 unmapped: 1589248 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930970 data_alloc: 218103808 data_used: 135168
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:51.309315+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83222528 unmapped: 1589248 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:52.315094+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83222528 unmapped: 1589248 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:53.315269+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83222528 unmapped: 1589248 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:54.315431+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83222528 unmapped: 1589248 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:55.315678+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83222528 unmapped: 1589248 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930970 data_alloc: 218103808 data_used: 135168
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:56.315848+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83222528 unmapped: 1589248 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb235d5c00
Dec 06 10:27:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 14.050199509s of 14.091516495s, submitted: 5
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:57.316080+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83222528 unmapped: 1589248 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:58.316881+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83279872 unmapped: 1531904 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:55:59.317071+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83279872 unmapped: 1531904 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:00.317446+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83296256 unmapped: 1515520 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930379 data_alloc: 218103808 data_used: 135168
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:01.317744+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83296256 unmapped: 1515520 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:02.318037+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83296256 unmapped: 1515520 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:03.318283+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83304448 unmapped: 1507328 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:04.318548+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83304448 unmapped: 1507328 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:05.318733+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83304448 unmapped: 1507328 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931891 data_alloc: 218103808 data_used: 135168
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:06.318937+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83304448 unmapped: 1507328 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:07.319164+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83304448 unmapped: 1507328 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:08.319410+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83304448 unmapped: 1507328 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.031768799s of 12.082296371s, submitted: 15
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:09.319553+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83329024 unmapped: 1482752 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:10.322383+0000)
Dec 06 10:27:30 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83329024 unmapped: 1482752 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:30 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:30 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931284 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:30 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:30 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:11.322520+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83345408 unmapped: 1466368 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:12.322698+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83345408 unmapped: 1466368 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:13.322884+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83345408 unmapped: 1466368 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:14.323088+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83345408 unmapped: 1466368 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:15.323330+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83345408 unmapped: 1466368 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931152 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:16.323502+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83345408 unmapped: 1466368 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:17.323683+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83345408 unmapped: 1466368 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:18.323864+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 1449984 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:19.323992+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 1449984 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:20.324182+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 1449984 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931152 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:21.324322+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 1449984 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:22.324469+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 1449984 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:23.324698+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 1449984 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:24.324899+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 1449984 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:25.325045+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 1449984 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931152 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:26.325716+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 1449984 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:27.325878+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 1449984 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:28.326018+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 1449984 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:29.326193+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 1449984 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:30.326573+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 1449984 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931152 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:31.326719+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 1449984 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:32.326864+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 1449984 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:33.327078+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 1449984 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:34.327214+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 1449984 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:35.327392+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 1449984 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931152 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:36.327550+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 1449984 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:37.327708+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83361792 unmapped: 1449984 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:38.327846+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:39.327989+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:40.328196+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931152 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:41.328350+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:42.328515+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:43.328669+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:44.328913+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:45.329532+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931152 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:46.329703+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:47.329854+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:48.330262+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:49.330418+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:50.331423+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931152 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:51.332286+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:52.332451+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:53.332626+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:54.332800+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:55.332978+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931152 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:56.333179+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:57.333640+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:58.334062+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:56:59.334214+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:00.334416+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931152 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:01.334781+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:02.334932+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:03.335247+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:04.335428+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 ms_handle_reset con 0x55fb235d5c00 session 0x55fb26f585a0
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:05.335732+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931152 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:06.335979+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:07.336137+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:08.336302+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:09.336571+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:10.336794+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931152 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:11.337044+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:12.337219+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:13.337389+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:14.337726+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2417cc00
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 66.122673035s of 66.133468628s, submitted: 3
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:15.337861+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931284 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:16.338041+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:17.338196+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83386368 unmapped: 1425408 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:18.338342+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83402752 unmapped: 1409024 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:19.338539+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83402752 unmapped: 1409024 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:20.338768+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83402752 unmapped: 1409024 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931300 data_alloc: 218103808 data_used: 135168
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:21.338970+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83402752 unmapped: 1409024 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:22.339176+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83402752 unmapped: 1409024 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:23.339329+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83402752 unmapped: 1409024 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:24.339522+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83402752 unmapped: 1409024 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:25.339681+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83402752 unmapped: 1409024 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 931300 data_alloc: 218103808 data_used: 135168
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:26.339848+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83402752 unmapped: 1409024 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:27.340050+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.004903793s of 12.174333572s, submitted: 10
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83402752 unmapped: 1409024 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:28.340235+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83402752 unmapped: 1409024 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:29.340370+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83402752 unmapped: 1409024 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:30.340593+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83402752 unmapped: 1409024 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930693 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:31.340748+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83402752 unmapped: 1409024 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:32.340879+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:33.341043+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:34.341224+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:35.341375+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:36.341526+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:37.341689+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:38.341842+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:39.341972+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:40.342180+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:41.342337+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:42.342554+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:43.342729+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:44.342965+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:45.343203+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:46.343490+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:47.343676+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:48.343899+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:49.344056+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:50.344270+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:51.344442+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:52.344620+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:53.344806+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:54.344945+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:55.345132+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:56.345334+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:57.345535+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:58.345698+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:57:59.345883+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:00.346069+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:01.346225+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:02.346423+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:03.346603+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:04.346835+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:05.347071+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:06.347281+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:07.347463+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:08.347618+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:09.347784+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:10.347984+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:11.348253+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:12.348436+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:13.348660+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:14.348810+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:15.348943+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:16.349088+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:17.349284+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:18.349461+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:19.349593+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:20.349794+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:21.349980+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:22.350195+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:23.350361+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:24.350485+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:25.350620+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:26.350813+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:27.350939+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:28.351085+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:29.351318+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:30.351632+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:31.351834+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:32.351973+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:33.352095+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:34.352226+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:35.352410+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:36.352564+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:37.352721+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:38.352895+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:39.353064+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:40.353213+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:41.353335+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:42.353514+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:43.353679+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:44.353857+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:45.353986+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:46.354117+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83410944 unmapped: 1400832 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:47.354261+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:48.354414+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:49.354553+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:50.354817+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:51.354981+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:52.355119+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:53.355225+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:54.355399+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:55.355574+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:56.355742+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:57.355911+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:58.356078+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:58:59.356227+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:00.356486+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:01.356641+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:02.356863+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:03.357012+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:04.357175+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:05.357348+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Cumulative writes: 7804 writes, 31K keys, 7804 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.02 MB/s
                                           Cumulative WAL: 7804 writes, 1639 syncs, 4.76 writes per sync, written: 0.02 GB, 0.02 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 625 writes, 1051 keys, 625 commit groups, 1.0 writes per commit group, ingest: 0.41 MB, 0.00 MB/s
                                           Interval WAL: 625 writes, 306 syncs, 2.04 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.013       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.013       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.01              0.00         1    0.013       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fb227cf350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fb227cf350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fb227cf350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fb227cf350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fb227cf350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fb227cf350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fb227cf350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fb227ce9b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 6e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fb227ce9b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 6e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.003       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fb227ce9b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 6e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fb227cf350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.1 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55fb227cf350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.8e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:06.357561+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:07.357712+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:08.357891+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:09.358052+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:10.358257+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:11.358385+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:12.358513+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:13.358689+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:14.358867+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:15.359079+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb26b301e0
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:16.359290+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:17.359475+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:18.359637+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:19.359802+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:20.360023+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:21.360183+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:22.360364+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:23.360525+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:24.360698+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:25.360827+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:26.360984+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 930561 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2417d400
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 119.449485779s of 119.461830139s, submitted: 2
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:27.361111+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:28.361224+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:29.361399+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:30.361631+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:31.361799+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932221 data_alloc: 218103808 data_used: 135168
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83419136 unmapped: 1392640 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:32.361949+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83443712 unmapped: 1368064 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:33.362138+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83443712 unmapped: 1368064 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:34.362277+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83443712 unmapped: 1368064 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:35.362398+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83443712 unmapped: 1368064 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:36.362516+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932221 data_alloc: 218103808 data_used: 135168
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:37.362672+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83443712 unmapped: 1368064 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:38.362809+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83443712 unmapped: 1368064 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:39.362973+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83443712 unmapped: 1368064 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:40.363138+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83443712 unmapped: 1368064 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 14.086344719s of 14.123706818s, submitted: 11
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:41.363281+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83443712 unmapped: 1368064 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932089 data_alloc: 218103808 data_used: 135168
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:42.363654+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83443712 unmapped: 1368064 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:43.363790+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83443712 unmapped: 1368064 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:44.363951+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83443712 unmapped: 1368064 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:45.364100+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83443712 unmapped: 1368064 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:46.364276+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83443712 unmapped: 1368064 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932073 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:47.364410+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83443712 unmapped: 1368064 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:48.364544+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83443712 unmapped: 1368064 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:49.364698+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83443712 unmapped: 1368064 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:50.364936+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83443712 unmapped: 1368064 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:51.365063+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83443712 unmapped: 1368064 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932073 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:52.365203+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:53.365335+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:54.365495+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:55.365667+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:56.365818+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932073 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread fragmentation_score=0.000028 took=0.000254s
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:57.366025+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:58.366254+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T09:59:59.366408+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:00.366682+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:01.366897+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932073 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:02.367119+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:03.367372+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:04.367553+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:05.367733+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:06.367925+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932073 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:07.368075+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:08.368251+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:09.368521+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:10.368731+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:11.368926+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932073 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:12.369220+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:13.369393+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:14.369673+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:15.369845+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:16.370047+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932073 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:17.370232+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:18.370420+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:19.370628+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:20.370888+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:21.371113+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932073 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:22.371387+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:23.371572+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:24.371900+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:25.372131+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:26.372370+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932073 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:27.372586+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:28.372759+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:29.372916+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:30.373199+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:31.373583+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932073 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:32.374062+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:33.374463+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:34.374881+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:35.375228+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83451904 unmapped: 1359872 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:36.375467+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932073 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:37.375758+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:38.376043+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:39.376256+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:40.376462+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:41.376647+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932073 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:42.376814+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:43.377051+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:44.377263+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:45.377417+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:46.377584+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932073 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:47.377722+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:48.377955+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:49.378214+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:50.378455+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:51.378642+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932073 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:52.378842+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:53.379058+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:54.379240+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:55.379435+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:56.379565+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932073 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:57.379718+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:58.379979+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:00:59.380256+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:00.380517+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:01.380720+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 ms_handle_reset con 0x55fb2417d400 session 0x55fb26f51a40
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932073 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:02.381222+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83460096 unmapped: 1351680 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:03.381367+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83468288 unmapped: 1343488 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:04.381505+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83468288 unmapped: 1343488 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:05.381651+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83468288 unmapped: 1343488 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:06.381812+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83468288 unmapped: 1343488 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932073 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:07.381974+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83468288 unmapped: 1343488 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:08.382133+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83468288 unmapped: 1343488 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:09.382350+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83468288 unmapped: 1343488 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:10.382537+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83468288 unmapped: 1343488 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:11.382716+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83468288 unmapped: 1343488 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932073 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:12.382874+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb25ef5800
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 90.763069153s of 91.630355835s, submitted: 1
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83476480 unmapped: 1335296 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:13.383056+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83476480 unmapped: 1335296 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:14.383214+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83476480 unmapped: 1335296 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:15.383366+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83476480 unmapped: 1335296 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:16.383509+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83476480 unmapped: 1335296 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932221 data_alloc: 218103808 data_used: 135168
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:17.383628+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83476480 unmapped: 1335296 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:18.383778+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83484672 unmapped: 1327104 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:19.383973+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83484672 unmapped: 1327104 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:20.384204+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83484672 unmapped: 1327104 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:21.384343+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83484672 unmapped: 1327104 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932974 data_alloc: 218103808 data_used: 135168
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:22.384539+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83484672 unmapped: 1327104 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:23.384715+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83484672 unmapped: 1327104 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:24.384865+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83484672 unmapped: 1327104 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:25.385048+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83484672 unmapped: 1327104 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fca68000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:26.385203+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 14.195486069s of 14.243807793s, submitted: 12
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83492864 unmapped: 1318912 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932994 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:27.385345+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83492864 unmapped: 1318912 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:28.385485+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83492864 unmapped: 1318912 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:29.385683+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83492864 unmapped: 1318912 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:30.385873+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83615744 unmapped: 1196032 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:31.385995+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [0,1])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83877888 unmapped: 933888 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932994 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:32.386199+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83877888 unmapped: 933888 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:33.386378+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83877888 unmapped: 933888 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:34.386514+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83877888 unmapped: 933888 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:35.386711+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83886080 unmapped: 925696 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:36.386875+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83886080 unmapped: 925696 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932994 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:37.387041+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83886080 unmapped: 925696 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:38.387209+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83886080 unmapped: 925696 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:39.387358+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83886080 unmapped: 925696 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:40.388999+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83894272 unmapped: 917504 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:41.389191+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83894272 unmapped: 917504 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932994 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:42.389351+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83894272 unmapped: 917504 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:43.389512+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83894272 unmapped: 917504 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:44.389696+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83894272 unmapped: 917504 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:45.389858+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83894272 unmapped: 917504 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:46.389997+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83894272 unmapped: 917504 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932994 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:47.390237+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83902464 unmapped: 909312 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:48.390400+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83902464 unmapped: 909312 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:49.390558+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83902464 unmapped: 909312 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:50.390769+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83902464 unmapped: 909312 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:51.390966+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83902464 unmapped: 909312 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932994 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:52.391165+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83902464 unmapped: 909312 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:53.391379+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83902464 unmapped: 909312 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:54.391544+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83910656 unmapped: 901120 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:55.391781+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83910656 unmapped: 901120 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:56.391936+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83910656 unmapped: 901120 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932994 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:57.392226+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83910656 unmapped: 901120 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:58.392382+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83910656 unmapped: 901120 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:01:59.392548+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83910656 unmapped: 901120 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:00.392771+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83910656 unmapped: 901120 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:01.393006+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83910656 unmapped: 901120 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932994 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:02.393297+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83910656 unmapped: 901120 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:03.393455+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83910656 unmapped: 901120 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:04.393693+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83918848 unmapped: 892928 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:05.393926+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83918848 unmapped: 892928 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:06.394103+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83918848 unmapped: 892928 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932994 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:07.394263+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83918848 unmapped: 892928 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:08.394419+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83918848 unmapped: 892928 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:09.394645+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:10.394912+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:11.395171+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932994 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:12.395325+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:13.395491+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:14.395618+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 ms_handle_reset con 0x55fb2417cc00 session 0x55fb268f6000
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:15.395801+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:16.395944+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932994 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:17.396127+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:18.396372+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:19.396509+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:20.396709+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:21.396877+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 932994 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:22.397090+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:23.397240+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:24.397406+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb26b42c00
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 57.946811676s of 58.467189789s, submitted: 205
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:25.397605+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:26.397785+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 933126 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:27.397961+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:28.398135+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:29.398328+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:30.398514+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83927040 unmapped: 884736 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:31.398658+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 936166 data_alloc: 218103808 data_used: 135168
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:32.398823+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:33.398978+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:34.399124+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:35.399358+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:36.399513+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935407 data_alloc: 218103808 data_used: 135168
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:37.399698+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:38.399827+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:39.399999+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 14.869669914s of 14.905448914s, submitted: 12
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:40.400235+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:41.400375+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935427 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:42.400513+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:43.400784+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:44.400924+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:45.401112+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:46.401264+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935427 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:47.401456+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:48.401650+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:49.401852+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:50.402115+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:51.402283+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935427 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:52.402459+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:53.402650+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:54.402828+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:55.402973+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:56.403193+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935427 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:57.403371+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:58.403506+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:02:59.403653+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:00.403851+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:01.404023+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935427 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:02.404210+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:03.404360+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:04.404517+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:05.404649+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:06.404783+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935427 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:07.405041+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:08.405198+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:09.405332+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:10.405547+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:11.407332+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:12.408190+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935427 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:13.408573+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:14.410101+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:15.410931+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:16.411241+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:17.411453+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935427 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:18.411604+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:19.411845+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:20.412103+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:21.412303+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:22.412565+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935427 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:23.412794+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:24.412985+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:25.413179+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:26.413383+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:27.413605+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935427 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:28.413835+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:29.414131+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:30.414563+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:31.414891+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:32.415046+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935427 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:33.415237+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:34.415404+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:35.415566+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:36.415773+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:37.415920+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935427 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:38.416096+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:39.416277+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:40.416518+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:41.416683+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:42.416879+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935427 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:43.417044+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:44.417247+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:45.417417+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:46.417605+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:47.417706+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935427 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:48.417836+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:49.417968+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:50.418198+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:51.418392+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:52.418539+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935427 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:53.418701+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:54.418850+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:55.419040+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:56.419203+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:57.419392+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935427 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:58.419505+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:03:59.419677+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:00.419894+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:01.420058+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:02.420212+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935427 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:03.420462+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:04.420596+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:05.420824+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:06.421022+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83935232 unmapped: 876544 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:07.421253+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83943424 unmapped: 868352 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935427 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:08.421440+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83943424 unmapped: 868352 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:09.421660+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83943424 unmapped: 868352 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:10.421874+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83943424 unmapped: 868352 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:11.422045+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83943424 unmapped: 868352 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fc658000/0x0/0x4ffc00000, data 0x10001c/0x1b4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:12.422234+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83943424 unmapped: 868352 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 935427 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:13.422364+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 146 handle_osd_map epochs [147,147], i have 146, src has [1,147]
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 93.416572571s of 93.421234131s, submitted: 1
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83943424 unmapped: 868352 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:14.422527+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83943424 unmapped: 868352 heap: 84811776 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb26b43400
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _renew_subs
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 147 handle_osd_map epochs [148,148], i have 147, src has [1,148]
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:15.422658+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 148 heartbeat osd_stat(store_statfs(0x4fc1e1000/0x0/0x4ffc00000, data 0x574248/0x62a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 83984384 unmapped: 10141696 heap: 94126080 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 148 handle_osd_map epochs [148,149], i have 148, src has [1,149]
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 149 ms_handle_reset con 0x55fb26b43400 session 0x55fb26f9b2c0
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:16.422783+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb26b43400
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84090880 unmapped: 18432000 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:17.422943+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84131840 unmapped: 18391040 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1056837 data_alloc: 218103808 data_used: 143360
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 149 handle_osd_map epochs [149,150], i have 149, src has [1,150]
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 150 ms_handle_reset con 0x55fb26b43400 session 0x55fb26c185a0
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:18.423127+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 150 heartbeat osd_stat(store_statfs(0x4fb56a000/0x0/0x4ffc00000, data 0x11e847b/0x12a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84131840 unmapped: 18391040 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:19.423349+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84131840 unmapped: 18391040 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:20.423680+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84131840 unmapped: 18391040 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 150 heartbeat osd_stat(store_statfs(0x4fb56a000/0x0/0x4ffc00000, data 0x11e847b/0x12a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:21.423915+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84131840 unmapped: 18391040 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:22.424128+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84131840 unmapped: 18391040 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1062395 data_alloc: 218103808 data_used: 143360
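
The two commit_cache_size values logged every cycle are exact small fractions: 0.285714... is 2/7 and 0.0555556 is 1/18. A plausible reading, and it is only an assumption, is that these are the high-priority shares of two RocksDB block-cache pools being re-derived each time the caches are re-balanced; the sketch below only confirms the fraction identification, which is plain arithmetic:

    from fractions import Fraction

    # Recover the exact fractions behind the logged decimals.
    for logged in (0.285714, 0.0555556):
        f = Fraction(logged).limit_denominator(100)
        print(f"{logged} ~ {f} = {float(f):.6f}")
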
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 150 handle_osd_map epochs [151,151], i have 150, src has [1,151]
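
A handle_osd_map line of the form "epochs [first,last], i have N, src has [1,last]" says the monitor delivered maps for epochs first..last while this OSD sits at epoch N; the OSD applies what is new and subsequent lines log under the higher epoch, exactly as the earlier 148 -> 149 and 149 -> 150 steps show. A toy model of that catch-up rule, illustrative only and not Ceph's code (epochs_to_apply is my name):

    def epochs_to_apply(have: int, first: int, last: int) -> range:
        """Epochs a [first,last] message lets us apply, oldest first."""
        if last <= have:        # everything in the message is old news
            return range(0, 0)
        if first > have + 1:    # gap: epochs have+1..first-1 must be fetched first
            return range(0, 0)
        return range(have + 1, last + 1)

    # From the line above: "epochs [151,151], i have 150" -> apply epoch 151.
    print(list(epochs_to_apply(150, 151, 151)))   # [151]
    # From earlier in this window: "epochs [148,149], i have 148" -> epoch 149.
    print(list(epochs_to_apply(148, 148, 149)))   # [149]
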
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:23.424360+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84140032 unmapped: 18382848 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 151 heartbeat osd_stat(store_statfs(0x4fb567000/0x0/0x4ffc00000, data 0x11ea44d/0x12a4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:24.424575+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84140032 unmapped: 18382848 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 151 heartbeat osd_stat(store_statfs(0x4fb567000/0x0/0x4ffc00000, data 0x11ea44d/0x12a4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:25.424818+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84115456 unmapped: 18407424 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:26.425034+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84115456 unmapped: 18407424 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 151 heartbeat osd_stat(store_statfs(0x4fb567000/0x0/0x4ffc00000, data 0x11ea44d/0x12a4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:27.425226+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1065153 data_alloc: 218103808 data_used: 143360
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:28.425379+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:29.425534+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 151 heartbeat osd_stat(store_statfs(0x4fb567000/0x0/0x4ffc00000, data 0x11ea44d/0x12a4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:30.425754+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:31.425957+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 151 heartbeat osd_stat(store_statfs(0x4fb567000/0x0/0x4ffc00000, data 0x11ea44d/0x12a4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:32.426102+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1065153 data_alloc: 218103808 data_used: 143360
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:33.426264+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 151 heartbeat osd_stat(store_statfs(0x4fb567000/0x0/0x4ffc00000, data 0x11ea44d/0x12a4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:34.426526+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 151 heartbeat osd_stat(store_statfs(0x4fb567000/0x0/0x4ffc00000, data 0x11ea44d/0x12a4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:35.426698+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 151 ms_handle_reset con 0x55fb24bd2400 session 0x55fb26acb4a0
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:36.426872+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:37.427055+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1065153 data_alloc: 218103808 data_used: 143360
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:38.427222+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:39.427404+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 151 heartbeat osd_stat(store_statfs(0x4fb567000/0x0/0x4ffc00000, data 0x11ea44d/0x12a4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:40.427610+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 151 ms_handle_reset con 0x55fb26b42c00 session 0x55fb26db8d20
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:41.427787+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:42.427981+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1065153 data_alloc: 218103808 data_used: 143360
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:43.428260+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:44.428437+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 151 heartbeat osd_stat(store_statfs(0x4fb567000/0x0/0x4ffc00000, data 0x11ea44d/0x12a4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:45.428633+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:46.428802+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb235d5c00
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 33.286060333s of 33.423255920s, submitted: 40
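
The _kv_sync_thread line is the most direct load signal in this window: BlueStore's RocksDB commit thread was idle 33.286 s of the last 33.423 s while flushing 40 transactions, so the commit path is well under one percent busy. Turning such a line into a busy percentage is simple arithmetic:

    def kv_busy_pct(idle_s: float, interval_s: float) -> float:
        """Busy fraction of the kv sync thread over its reporting interval."""
        return 100.0 * (interval_s - idle_s) / interval_s

    # Numbers copied from the utilization line above.
    print(f"{kv_busy_pct(33.286060333, 33.423255920):.2f}% busy")   # ~0.41%
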
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:47.429011+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84123648 unmapped: 18399232 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1064445 data_alloc: 218103808 data_used: 143360
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:48.429233+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84131840 unmapped: 18391040 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:49.429399+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84131840 unmapped: 18391040 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 151 heartbeat osd_stat(store_statfs(0x4fb568000/0x0/0x4ffc00000, data 0x11ea44d/0x12a4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:50.429672+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84131840 unmapped: 18391040 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 151 heartbeat osd_stat(store_statfs(0x4fb568000/0x0/0x4ffc00000, data 0x11ea44d/0x12a4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2417cc00
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:51.429869+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84131840 unmapped: 18391040 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:52.430038+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84131840 unmapped: 18391040 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1064593 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:53.430268+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84172800 unmapped: 18350080 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:54.430460+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84172800 unmapped: 18350080 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 151 heartbeat osd_stat(store_statfs(0x4fb568000/0x0/0x4ffc00000, data 0x11ea44d/0x12a4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:55.430619+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84172800 unmapped: 18350080 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:56.430781+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 151 heartbeat osd_stat(store_statfs(0x4fb568000/0x0/0x4ffc00000, data 0x11ea44d/0x12a4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84131840 unmapped: 18391040 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:57.430976+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84131840 unmapped: 18391040 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 151 heartbeat osd_stat(store_statfs(0x4fb568000/0x0/0x4ffc00000, data 0x11ea44d/0x12a4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1063986 data_alloc: 218103808 data_used: 143360
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:58.431223+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84131840 unmapped: 18391040 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:04:59.431393+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.490637779s of 12.528245926s, submitted: 11
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84140032 unmapped: 18382848 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:00.431667+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 151 heartbeat osd_stat(store_statfs(0x4fb568000/0x0/0x4ffc00000, data 0x11ea44d/0x12a4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84140032 unmapped: 18382848 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:01.431837+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84140032 unmapped: 18382848 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:02.432022+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84140032 unmapped: 18382848 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1063702 data_alloc: 218103808 data_used: 139264
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:03.432295+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 84140032 unmapped: 18382848 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:04.432488+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2417d400
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 151 ms_handle_reset con 0x55fb2417d400 session 0x55fb268d2b40
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 88793088 unmapped: 13729792 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:05.432813+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 151 handle_osd_map epochs [151,152], i have 151, src has [1,152]
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 89841664 unmapped: 12681216 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 152 heartbeat osd_stat(store_statfs(0x4fb568000/0x0/0x4ffc00000, data 0x11ea44d/0x12a4000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:06.433018+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb24bd1c00
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _renew_subs
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 152 handle_osd_map epochs [153,153], i have 152, src has [1,153]
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 90734592 unmapped: 11788288 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 153 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb240bc3c0
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:07.433240+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 90210304 unmapped: 12312576 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1113326 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:08.433505+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 90243072 unmapped: 12279808 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:09.433725+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 90243072 unmapped: 12279808 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 153 heartbeat osd_stat(store_statfs(0x4fb1bc000/0x0/0x4ffc00000, data 0x1592679/0x164e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:10.433934+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb24bd1c00
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.364974976s of 10.899864197s, submitted: 40
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 153 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb24f02b40
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 90275840 unmapped: 12247040 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:11.434108+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 90275840 unmapped: 12247040 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2417d400
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:12.434284+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 93954048 unmapped: 8568832 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1138423 data_alloc: 218103808 data_used: 8523776
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:13.434459+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 93954048 unmapped: 8568832 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:14.434617+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _renew_subs
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 153 handle_osd_map epochs [154,154], i have 153, src has [1,154]
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 93954048 unmapped: 8568832 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:15.434769+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fb1b9000/0x0/0x4ffc00000, data 0x159466e/0x1652000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 93954048 unmapped: 8568832 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:16.434932+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 93954048 unmapped: 8568832 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb235d5c00 session 0x55fb268d3e00
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:17.435209+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 93954048 unmapped: 8568832 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1142141 data_alloc: 218103808 data_used: 8527872
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:18.435365+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 93954048 unmapped: 8568832 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:19.435705+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 93954048 unmapped: 8568832 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fb1b9000/0x0/0x4ffc00000, data 0x159466e/0x1652000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:20.436020+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 93954048 unmapped: 8568832 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:21.437500+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 93954048 unmapped: 8568832 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:22.438211+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fb1b9000/0x0/0x4ffc00000, data 0x159466e/0x1652000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 93954048 unmapped: 8568832 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1142141 data_alloc: 218103808 data_used: 8527872
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.835918427s of 12.857731819s, submitted: 18
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:23.438585+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fb1b9000/0x0/0x4ffc00000, data 0x159466e/0x1652000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [0,0,1])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 96157696 unmapped: 6365184 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:24.439057+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fad9d000/0x0/0x4ffc00000, data 0x19a366e/0x1a61000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0.
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 99008512 unmapped: 3514368 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:25.439486+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 99008512 unmapped: 3514368 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9be3000/0x0/0x4ffc00000, data 0x19b566e/0x1a73000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:26.439723+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 99008512 unmapped: 3514368 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:27.439902+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb26b42c00
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1175489 data_alloc: 218103808 data_used: 8544256
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 98172928 unmapped: 4349952 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:28.440665+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 98172928 unmapped: 4349952 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:29.441251+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 98172928 unmapped: 4349952 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:30.441650+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 98172928 unmapped: 4349952 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:31.442090+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 98172928 unmapped: 4349952 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9bf9000/0x0/0x4ffc00000, data 0x19b566e/0x1a73000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:32.443058+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1175673 data_alloc: 218103808 data_used: 8540160
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 98181120 unmapped: 4341760 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:33.443649+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.407831192s of 10.563117027s, submitted: 55
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 98181120 unmapped: 4341760 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:34.443977+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 98181120 unmapped: 4341760 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:35.444533+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 98181120 unmapped: 4341760 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:36.444937+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 98181120 unmapped: 4341760 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:37.445253+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9bf9000/0x0/0x4ffc00000, data 0x19b566e/0x1a73000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1175505 data_alloc: 218103808 data_used: 8540160
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 98181120 unmapped: 4341760 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:38.445488+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 98181120 unmapped: 4341760 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:39.445722+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 98181120 unmapped: 4341760 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:40.445929+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9bf9000/0x0/0x4ffc00000, data 0x19b566e/0x1a73000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 98181120 unmapped: 4341760 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:41.446079+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 98181120 unmapped: 4341760 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:42.446316+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1175205 data_alloc: 218103808 data_used: 8540160
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 98181120 unmapped: 4341760 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:43.446523+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 98181120 unmapped: 4341760 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:44.446772+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb26b43400
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb26b43400 session 0x55fb271ad0e0
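
A handle_auth_request challenge immediately followed by ms_handle_reset on the same connection address (0x55fb26b43400 here, with the same pairing at several earlier points) reads as a peer reconnecting: the fresh session authenticates and the stale one is dropped. Calling this routine churn rather than a fault is my interpretation; nothing in these lines is logged as an error. A grep-style sketch that pairs the two events by connection pointer:

    import re

    # Sample lines copied from the journal above.
    LOG = [
        "monclient: handle_auth_request added challenge on 0x55fb26b43400",
        "osd.0 154 ms_handle_reset con 0x55fb26b43400 session 0x55fb271ad0e0",
    ]
    CON = re.compile(r"0x[0-9a-f]+")

    challenged = set()
    for line in LOG:
        con = CON.search(line).group(0)   # first pointer is the connection
        if "handle_auth_request" in line:
            challenged.add(con)
        elif "ms_handle_reset" in line and con in challenged:
            print(f"reset follows auth challenge on {con}")
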
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 98181120 unmapped: 4341760 heap: 102522880 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:45.447080+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9bf9000/0x0/0x4ffc00000, data 0x19b566e/0x1a73000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb26b43800
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.870066643s of 11.887675285s, submitted: 5
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb26b43800 session 0x55fb271ac000
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 97779712 unmapped: 13139968 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:46.447321+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 97779712 unmapped: 13139968 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:47.447532+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f92da000/0x0/0x4ffc00000, data 0x22d466e/0x2392000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1243431 data_alloc: 218103808 data_used: 8544256
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 97845248 unmapped: 13074432 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:48.447754+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 97845248 unmapped: 13074432 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:49.448061+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 97845248 unmapped: 13074432 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:50.448375+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb26b43c00
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb26b43c00 session 0x55fb26a99e00
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 97845248 unmapped: 13074432 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:51.448547+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f92da000/0x0/0x4ffc00000, data 0x22d466e/0x2392000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 97845248 unmapped: 13074432 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:52.448689+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb235d5c00
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb235d5c00 session 0x55fb23791e00
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1243431 data_alloc: 218103808 data_used: 8544256
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 97861632 unmapped: 13058048 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:53.448856+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb24bd1c00
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb26aca1e0
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb26b43400
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb26b43400 session 0x55fb2422b860
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 97607680 unmapped: 13312000 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:54.449062+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f92b6000/0x0/0x4ffc00000, data 0x22f866e/0x23b6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb26b43800
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f92b6000/0x0/0x4ffc00000, data 0x22f866e/0x23b6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 97607680 unmapped: 13312000 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:55.449220+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb26b42400
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 101277696 unmapped: 9641984 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:56.449400+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105553920 unmapped: 5365760 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:57.449551+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f92b4000/0x0/0x4ffc00000, data 0x22f966e/0x23b7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1311187 data_alloc: 234881024 data_used: 17436672
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105586688 unmapped: 5332992 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:58.449683+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105586688 unmapped: 5332992 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:05:59.449834+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105586688 unmapped: 5332992 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:00.450037+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105586688 unmapped: 5332992 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:01.450233+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105594880 unmapped: 5324800 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:02.450369+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f92b4000/0x0/0x4ffc00000, data 0x22f966e/0x23b7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1311187 data_alloc: 234881024 data_used: 17436672
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105594880 unmapped: 5324800 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:03.450472+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f92b4000/0x0/0x4ffc00000, data 0x22f966e/0x23b7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f92b4000/0x0/0x4ffc00000, data 0x22f966e/0x23b7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105594880 unmapped: 5324800 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:04.450636+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105594880 unmapped: 5324800 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:05.450735+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb26b42c00 session 0x55fb24bb70e0
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105594880 unmapped: 5324800 heap: 110919680 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:06.450846+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 20.669349670s of 20.780309677s, submitted: 12
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109428736 unmapped: 3588096 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:07.451013+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1383089 data_alloc: 234881024 data_used: 18948096
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110968832 unmapped: 2048000 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:08.451188+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110968832 unmapped: 2048000 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:09.451406+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f8aff000/0x0/0x4ffc00000, data 0x2aaf66e/0x2b6d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110968832 unmapped: 2048000 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:10.451621+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110968832 unmapped: 2048000 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:11.451779+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110968832 unmapped: 2048000 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:12.451950+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f8aff000/0x0/0x4ffc00000, data 0x2aaf66e/0x2b6d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1383089 data_alloc: 234881024 data_used: 18948096
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 111001600 unmapped: 2015232 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:13.452109+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f8aff000/0x0/0x4ffc00000, data 0x2aaf66e/0x2b6d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 111001600 unmapped: 2015232 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:14.452288+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 111001600 unmapped: 2015232 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:15.452624+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 111001600 unmapped: 2015232 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:16.452893+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 111001600 unmapped: 2015232 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb26b42400 session 0x55fb26d654a0
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb26b43800 session 0x55fb26da5c20
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:17.453096+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb235d5c00
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.124892235s of 11.262226105s, submitted: 63
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb25c3c400
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25c3c400 session 0x55fb26a990e0
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1183609 data_alloc: 218103808 data_used: 7954432
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 103227392 unmapped: 9789440 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:18.453303+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 103227392 unmapped: 9789440 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9bf8000/0x0/0x4ffc00000, data 0x19b666e/0x1a74000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:19.453471+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417cc00 session 0x55fb26a96f00
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9bf8000/0x0/0x4ffc00000, data 0x19b666e/0x1a74000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 103227392 unmapped: 9789440 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:20.453712+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 103227392 unmapped: 9789440 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:21.453886+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 103227392 unmapped: 9789440 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:22.454052+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9bf8000/0x0/0x4ffc00000, data 0x19b666e/0x1a74000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1183625 data_alloc: 218103808 data_used: 7950336
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 103227392 unmapped: 9789440 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:23.454239+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417d400 session 0x55fb26da4780
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb23f50000
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb23f50000 session 0x55fb26f39c20
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100892672 unmapped: 12124160 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:24.454396+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3be000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100892672 unmapped: 12124160 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:25.454581+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3be000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100892672 unmapped: 12124160 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:26.454761+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3be000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3be000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100892672 unmapped: 12124160 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:27.454938+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1099353 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100892672 unmapped: 12124160 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:28.455047+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100892672 unmapped: 12124160 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:29.455219+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100892672 unmapped: 12124160 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:30.455418+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3be000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2417d400
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.781655312s of 12.873902321s, submitted: 32
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100900864 unmapped: 12115968 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:31.455588+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100900864 unmapped: 12115968 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:32.455693+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1099077 data_alloc: 218103808 data_used: 4792320
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100909056 unmapped: 12107776 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:33.455823+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100917248 unmapped: 12099584 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:34.455976+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100917248 unmapped: 12099584 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:35.456119+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100917248 unmapped: 12099584 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:36.456258+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100917248 unmapped: 12099584 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:37.456443+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1100589 data_alloc: 218103808 data_used: 4792320
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100917248 unmapped: 12099584 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:38.456616+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100917248 unmapped: 12099584 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:39.456769+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100917248 unmapped: 12099584 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:40.456966+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100917248 unmapped: 12099584 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:41.457117+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100917248 unmapped: 12099584 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:42.457321+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.064629555s of 12.113365173s, submitted: 14
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1100573 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100917248 unmapped: 12099584 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:43.457457+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100917248 unmapped: 12099584 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:44.457656+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100917248 unmapped: 12099584 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:45.457838+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100917248 unmapped: 12099584 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:46.458033+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100917248 unmapped: 12099584 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:47.458249+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1100441 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:48.458399+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100917248 unmapped: 12099584 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:49.458580+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100917248 unmapped: 12099584 heap: 113016832 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2417cc00
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:50.458772+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 104062976 unmapped: 13164544 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417cc00 session 0x55fb26649680
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:51.458926+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100925440 unmapped: 16302080 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:52.459217+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100925440 unmapped: 16302080 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9d46000/0x0/0x4ffc00000, data 0x186964b/0x1926000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1152691 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:53.459432+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100925440 unmapped: 16302080 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9d46000/0x0/0x4ffc00000, data 0x186964b/0x1926000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:54.459571+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100925440 unmapped: 16302080 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:55.459803+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100933632 unmapped: 16293888 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb25c3c400
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 13.373836517s of 13.440272331s, submitted: 18
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25c3c400 session 0x55fb240be1e0
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:56.459989+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100974592 unmapped: 16252928 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb26b42c00
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9d46000/0x0/0x4ffc00000, data 0x186964b/0x1926000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:57.460217+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 100982784 unmapped: 16244736 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1179880 data_alloc: 218103808 data_used: 8491008
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:58.460391+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 101285888 unmapped: 15941632 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:06:59.460617+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 101965824 unmapped: 15261696 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9d45000/0x0/0x4ffc00000, data 0x186966e/0x1927000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:00.460863+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 101965824 unmapped: 15261696 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:01.461049+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 101965824 unmapped: 15261696 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:02.461244+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 101965824 unmapped: 15261696 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:03.461396+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192648 data_alloc: 234881024 data_used: 10371072
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 101965824 unmapped: 15261696 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9d45000/0x0/0x4ffc00000, data 0x186966e/0x1927000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:04.461589+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 101965824 unmapped: 15261696 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:05.461751+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 101965824 unmapped: 15261696 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:06.461893+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9d45000/0x0/0x4ffc00000, data 0x186966e/0x1927000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 101965824 unmapped: 15261696 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:07.462098+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 101965824 unmapped: 15261696 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:08.462258+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192648 data_alloc: 234881024 data_used: 10371072
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 101965824 unmapped: 15261696 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.659880638s of 12.670410156s, submitted: 4
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:09.462443+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106905600 unmapped: 10321920 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:10.462670+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 9469952 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:11.462852+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 9469952 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9049000/0x0/0x4ffc00000, data 0x255c66e/0x261a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:12.463020+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 9469952 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:13.463241+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1313916 data_alloc: 234881024 data_used: 12308480
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 9469952 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:14.463390+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 9469952 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9049000/0x0/0x4ffc00000, data 0x255c66e/0x261a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:15.463527+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106782720 unmapped: 10444800 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:16.463736+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106782720 unmapped: 10444800 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:17.463902+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106782720 unmapped: 10444800 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:18.464085+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1306724 data_alloc: 234881024 data_used: 12316672
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106782720 unmapped: 10444800 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:19.464244+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106790912 unmapped: 10436608 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:20.464426+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f904f000/0x0/0x4ffc00000, data 0x255f66e/0x261d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106790912 unmapped: 10436608 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:21.464579+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106790912 unmapped: 10436608 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:22.464749+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106790912 unmapped: 10436608 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:23.464905+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1306724 data_alloc: 234881024 data_used: 12316672
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106790912 unmapped: 10436608 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:24.465042+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106790912 unmapped: 10436608 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f904f000/0x0/0x4ffc00000, data 0x255f66e/0x261d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:25.465223+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106790912 unmapped: 10436608 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:26.465363+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106790912 unmapped: 10436608 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:27.465547+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106799104 unmapped: 10428416 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:28.465704+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1306724 data_alloc: 234881024 data_used: 12316672
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106799104 unmapped: 10428416 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:29.465879+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106799104 unmapped: 10428416 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:30.466086+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106799104 unmapped: 10428416 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f904f000/0x0/0x4ffc00000, data 0x255f66e/0x261d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:31.466267+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106815488 unmapped: 10412032 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:32.466448+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106815488 unmapped: 10412032 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:33.466626+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308244 data_alloc: 234881024 data_used: 12402688
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106823680 unmapped: 10403840 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:34.466769+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 25.410940170s of 25.654003143s, submitted: 125
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106831872 unmapped: 10395648 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb26b42c00 session 0x55fb26da41e0
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2435d000
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:35.466896+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2435d000 session 0x55fb24e9bc20
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:36.467048+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9ccc000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:37.467198+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:38.467374+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1112810 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:39.467715+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:40.467935+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9ccc000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:41.468100+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:42.468265+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9ccc000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:43.468416+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1112810 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:44.468567+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:45.468775+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:46.468986+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:47.469191+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:48.469389+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1112810 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9ccc000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:49.469543+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:50.469758+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:51.469957+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:52.470165+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9ccc000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:53.470308+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1112810 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:54.470475+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:55.470688+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9ccc000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:56.470871+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:57.471110+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:58.471271+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1112810 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:07:59.471419+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9ccc000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:00.471608+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:01.471747+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:02.471940+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 102948864 unmapped: 14278656 heap: 117227520 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb26c92000
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb26c92000 session 0x55fb2422bc20
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2417cc00
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417cc00 session 0x55fb26acb2c0
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2435d000
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2435d000 session 0x55fb26acb860
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb25c3c400
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25c3c400 session 0x55fb26acb4a0
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb26b42c00
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 28.730512619s of 28.779657364s, submitted: 23
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:03.472067+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb26b42c00 session 0x55fb26acaf00
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb24bd1c00
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb24e881e0
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb24bd1c00
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb26aca1e0
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2417cc00
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417cc00 session 0x55fb24e88d20
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb25c3c400
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163827 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25c3c400 session 0x55fb24e89c20
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 103374848 unmapped: 22249472 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:04.472212+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 103374848 unmapped: 22249472 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:05.472371+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9ddf000/0x0/0x4ffc00000, data 0x17cf65b/0x188d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 103374848 unmapped: 22249472 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:06.472503+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb26b42c00
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb26b42c00 session 0x55fb26b30000
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 103374848 unmapped: 22249472 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb261b6c00
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb261b6c00 session 0x55fb26b301e0
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:07.472641+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2417cc00
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417cc00 session 0x55fb26b30780
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 103383040 unmapped: 22241280 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb24bd1c00
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:08.472833+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163827 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 103383040 unmapped: 22241280 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:09.473021+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 103383040 unmapped: 22241280 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb24f02b40
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb235d5c00 session 0x55fb268fe1e0
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:10.473270+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb25c3c400
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 103391232 unmapped: 22233088 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9ddf000/0x0/0x4ffc00000, data 0x17cf65b/0x188d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:11.473430+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 103391232 unmapped: 22233088 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:12.473569+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105062400 unmapped: 20561920 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:13.473723+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1195423 data_alloc: 234881024 data_used: 9515008
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105062400 unmapped: 20561920 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:14.473877+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9ddf000/0x0/0x4ffc00000, data 0x17cf65b/0x188d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105062400 unmapped: 20561920 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:15.474024+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9ddf000/0x0/0x4ffc00000, data 0x17cf65b/0x188d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105062400 unmapped: 20561920 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:16.474215+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105062400 unmapped: 20561920 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:17.474363+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105062400 unmapped: 20561920 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9ddf000/0x0/0x4ffc00000, data 0x17cf65b/0x188d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:18.474506+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1195423 data_alloc: 234881024 data_used: 9515008
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 104669184 unmapped: 20955136 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:19.474674+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 104669184 unmapped: 20955136 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:20.474892+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 104677376 unmapped: 20946944 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb261b7800
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 17.621421814s of 17.702753067s, submitted: 25
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:21.475036+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 104677376 unmapped: 20946944 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:22.475230+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105635840 unmapped: 19988480 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:23.475449+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1205371 data_alloc: 234881024 data_used: 9601024
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9cc9000/0x0/0x4ffc00000, data 0x18e565b/0x19a3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106078208 unmapped: 19546112 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:24.475600+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105332736 unmapped: 20291584 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:25.475804+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105332736 unmapped: 20291584 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9c3e000/0x0/0x4ffc00000, data 0x196f65b/0x1a2d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:26.475969+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105332736 unmapped: 20291584 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:27.476098+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105340928 unmapped: 20283392 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:28.476271+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1216501 data_alloc: 234881024 data_used: 9588736
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105340928 unmapped: 20283392 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:29.476423+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105472000 unmapped: 20152320 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:30.476615+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105472000 unmapped: 20152320 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:31.476780+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105472000 unmapped: 20152320 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9c20000/0x0/0x4ffc00000, data 0x198e65b/0x1a4c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:32.476929+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9c20000/0x0/0x4ffc00000, data 0x198e65b/0x1a4c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.623050690s of 11.763713837s, submitted: 49
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105472000 unmapped: 20152320 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:33.477138+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1216802 data_alloc: 234881024 data_used: 9592832
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105472000 unmapped: 20152320 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:34.477404+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105472000 unmapped: 20152320 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:35.477640+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105529344 unmapped: 20094976 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:36.477829+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105529344 unmapped: 20094976 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:37.478286+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105529344 unmapped: 20094976 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:38.478524+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9c12000/0x0/0x4ffc00000, data 0x199c65b/0x1a5a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1217114 data_alloc: 234881024 data_used: 9592832
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105529344 unmapped: 20094976 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:39.478693+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105529344 unmapped: 20094976 heap: 125624320 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:40.481371+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb25fd8000
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25fd8000 session 0x55fb240c32c0
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb240d5800
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb240d5800 session 0x55fb24f023c0
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb235d5c00
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb235d5c00 session 0x55fb268fc960
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2417cc00
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417cc00 session 0x55fb26ae6960
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb24bd1c00
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb26da43c0
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb25fd8000
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25fd8000 session 0x55fb24bd4f00
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb263e8c00
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb263e8c00 session 0x55fb24bd5c20
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb235d5c00
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105783296 unmapped: 23511040 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb235d5c00 session 0x55fb24bd5a40
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2417cc00
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417cc00 session 0x55fb24bd43c0
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:41.481603+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f93a0000/0x0/0x4ffc00000, data 0x220d66b/0x22cc000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105783296 unmapped: 23511040 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:42.481749+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105791488 unmapped: 23502848 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:43.482230+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1284570 data_alloc: 234881024 data_used: 9592832
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105791488 unmapped: 23502848 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:44.482636+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f939d000/0x0/0x4ffc00000, data 0x221066b/0x22cf000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105791488 unmapped: 23502848 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:45.483092+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105791488 unmapped: 23502848 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:46.483607+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105791488 unmapped: 23502848 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb24bd1c00
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:47.484073+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105791488 unmapped: 23502848 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:48.484413+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1316490 data_alloc: 234881024 data_used: 14340096
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 111575040 unmapped: 17719296 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:49.484719+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112599040 unmapped: 16695296 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:50.485045+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f939d000/0x0/0x4ffc00000, data 0x221066b/0x22cf000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112623616 unmapped: 16670720 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:51.485455+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112623616 unmapped: 16670720 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:52.485784+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f939d000/0x0/0x4ffc00000, data 0x221066b/0x22cf000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112656384 unmapped: 16637952 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:53.486015+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1342938 data_alloc: 234881024 data_used: 18280448
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112656384 unmapped: 16637952 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:54.486412+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 21.930198669s of 22.021116257s, submitted: 16
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112656384 unmapped: 16637952 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:55.486608+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f939b000/0x0/0x4ffc00000, data 0x221166b/0x22d0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112689152 unmapped: 16605184 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:56.486814+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112689152 unmapped: 16605184 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:57.486982+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112689152 unmapped: 16605184 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:58.487162+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1343466 data_alloc: 234881024 data_used: 18317312
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112803840 unmapped: 16490496 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:08:59.487283+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f939b000/0x0/0x4ffc00000, data 0x221166b/0x22d0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114663424 unmapped: 14630912 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:00.487427+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115384320 unmapped: 13910016 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:01.487576+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115392512 unmapped: 13901824 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:02.487779+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115458048 unmapped: 13836288 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:03.488009+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1362194 data_alloc: 234881024 data_used: 18333696
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9264000/0x0/0x4ffc00000, data 0x233366b/0x23f2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115458048 unmapped: 13836288 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:04.488194+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb261b7800 session 0x55fb25e2b860
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115458048 unmapped: 13836288 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9264000/0x0/0x4ffc00000, data 0x233366b/0x23f2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:05.488445+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1800.1 total, 600.0 interval
                                           Cumulative writes: 9251 writes, 35K keys, 9251 commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.01 MB/s
                                           Cumulative WAL: 9251 writes, 2253 syncs, 4.11 writes per sync, written: 0.03 GB, 0.01 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 1447 writes, 4631 keys, 1447 commit groups, 1.0 writes per commit group, ingest: 5.55 MB, 0.01 MB/s
                                           Interval WAL: 1447 writes, 614 syncs, 2.36 writes per sync, written: 0.01 GB, 0.01 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.082611084s of 11.189125061s, submitted: 41
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115490816 unmapped: 13803520 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:06.488751+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115490816 unmapped: 13803520 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:07.489007+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115490816 unmapped: 13803520 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:08.489255+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb24eda960
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1362210 data_alloc: 234881024 data_used: 18333696
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb25fd8000
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114360320 unmapped: 14934016 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:09.489505+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25fd8000 session 0x55fb268fbc20
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109035520 unmapped: 20258816 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:10.489815+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9c0e000/0x0/0x4ffc00000, data 0x19a065b/0x1a5e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109035520 unmapped: 20258816 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:11.490084+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109035520 unmapped: 20258816 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:12.490358+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109035520 unmapped: 20258816 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:13.490542+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1224948 data_alloc: 234881024 data_used: 9592832
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109035520 unmapped: 20258816 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:14.490744+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109035520 unmapped: 20258816 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:15.491007+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25c3c400 session 0x55fb268d2780
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9c0e000/0x0/0x4ffc00000, data 0x19a065b/0x1a5e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb25fd8000
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb235d5c00
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.954633713s of 10.025353432s, submitted: 24
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106209280 unmapped: 23085056 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb235d5c00 session 0x55fb240c14a0
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:16.491205+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106209280 unmapped: 23085056 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:17.491466+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106209280 unmapped: 23085056 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:18.491657+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1130496 data_alloc: 218103808 data_used: 4792320
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106209280 unmapped: 23085056 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:19.491871+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106209280 unmapped: 23085056 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:20.492114+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106209280 unmapped: 23085056 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:21.492264+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106250240 unmapped: 23044096 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:22.492437+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106250240 unmapped: 23044096 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:23.492624+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1130496 data_alloc: 218103808 data_used: 4792320
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106250240 unmapped: 23044096 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:24.492778+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106250240 unmapped: 23044096 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:25.492986+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106250240 unmapped: 23044096 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:26.493258+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106250240 unmapped: 23044096 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:27.493431+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106250240 unmapped: 23044096 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:28.493622+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1129757 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106250240 unmapped: 23044096 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:29.493835+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106250240 unmapped: 23044096 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:30.494033+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106250240 unmapped: 23044096 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:31.494249+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106250240 unmapped: 23044096 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:32.494399+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106250240 unmapped: 23044096 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:33.494535+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1129757 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106250240 unmapped: 23044096 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:34.494685+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106250240 unmapped: 23044096 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:35.494838+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106250240 unmapped: 23044096 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:36.495035+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106250240 unmapped: 23044096 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:37.495196+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105627648 unmapped: 23666688 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:38.495412+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1129757 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105627648 unmapped: 23666688 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:39.495585+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105627648 unmapped: 23666688 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:40.495808+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105627648 unmapped: 23666688 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:41.495968+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105627648 unmapped: 23666688 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:42.496236+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105627648 unmapped: 23666688 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:43.496384+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1129757 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105627648 unmapped: 23666688 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:44.496556+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2417cc00
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 29.085247040s of 29.158304214s, submitted: 25
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417cc00 session 0x55fb26f503c0
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105439232 unmapped: 23855104 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:45.496726+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9dda000/0x0/0x4ffc00000, data 0x17d564b/0x1892000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105439232 unmapped: 23855104 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:46.496900+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105439232 unmapped: 23855104 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:47.497034+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105439232 unmapped: 23855104 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:48.497235+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1178729 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105439232 unmapped: 23855104 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:49.497397+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb24bd1c00
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb24bd74a0
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb261b7800
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 105439232 unmapped: 23855104 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:50.497648+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb263e9000
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106323968 unmapped: 22970368 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:51.497851+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9db6000/0x0/0x4ffc00000, data 0x17f964b/0x18b6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9db6000/0x0/0x4ffc00000, data 0x17f964b/0x18b6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106946560 unmapped: 22347776 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:52.498024+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106946560 unmapped: 22347776 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:53.498194+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1222909 data_alloc: 234881024 data_used: 10895360
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106946560 unmapped: 22347776 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:54.498334+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9db6000/0x0/0x4ffc00000, data 0x17f964b/0x18b6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106946560 unmapped: 22347776 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:55.498481+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9db6000/0x0/0x4ffc00000, data 0x17f964b/0x18b6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106946560 unmapped: 22347776 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:56.498627+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106946560 unmapped: 22347776 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:57.498798+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106946560 unmapped: 22347776 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:58.499042+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1222909 data_alloc: 234881024 data_used: 10895360
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106954752 unmapped: 22339584 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:09:59.499319+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106954752 unmapped: 22339584 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:00.499583+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9db6000/0x0/0x4ffc00000, data 0x17f964b/0x18b6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106954752 unmapped: 22339584 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:01.499773+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 106954752 unmapped: 22339584 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:02.499934+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9db6000/0x0/0x4ffc00000, data 0x17f964b/0x18b6000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 17.328538895s of 17.374874115s, submitted: 11
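The _kv_sync_thread utilization line reports how much of the sampling window the BlueStore key-value sync thread spent idle, so the busy fraction is simply the remainder. A minimal sketch of that arithmetic, using the figures from the line above; the names are my own:

    # idle seconds, window length, and transactions submitted in the window
    idle, window, submitted = 17.328538895, 17.374874115, 11

    busy = window - idle
    print(f"kv_sync busy {busy:.3f}s of {window:.3f}s "
          f"({busy / window:.2%}), {submitted} transactions submitted")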
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114114560 unmapped: 15179776 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:03.500125+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1297901 data_alloc: 234881024 data_used: 12738560
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:04.500327+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116293632 unmapped: 13000704 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9697000/0x0/0x4ffc00000, data 0x1f0a64b/0x1fc7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:05.500574+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116301824 unmapped: 12992512 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9697000/0x0/0x4ffc00000, data 0x1f0a64b/0x1fc7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:06.500755+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116301824 unmapped: 12992512 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9697000/0x0/0x4ffc00000, data 0x1f0a64b/0x1fc7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:07.500938+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116301824 unmapped: 12992512 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:08.501138+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116367360 unmapped: 12926976 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1298205 data_alloc: 234881024 data_used: 12746752
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:09.501404+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116367360 unmapped: 12926976 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:10.501851+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116367360 unmapped: 12926976 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:11.502099+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116367360 unmapped: 12926976 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:12.502250+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116375552 unmapped: 12918784 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9697000/0x0/0x4ffc00000, data 0x1f0a64b/0x1fc7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:13.502510+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116375552 unmapped: 12918784 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1298509 data_alloc: 234881024 data_used: 12754944
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:14.502680+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116375552 unmapped: 12918784 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9697000/0x0/0x4ffc00000, data 0x1f0a64b/0x1fc7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:15.502828+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116375552 unmapped: 12918784 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:16.503000+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116375552 unmapped: 12918784 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9697000/0x0/0x4ffc00000, data 0x1f0a64b/0x1fc7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9697000/0x0/0x4ffc00000, data 0x1f0a64b/0x1fc7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:17.503242+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116375552 unmapped: 12918784 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:18.503484+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116375552 unmapped: 12918784 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1298509 data_alloc: 234881024 data_used: 12754944
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:19.503858+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116375552 unmapped: 12918784 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:20.504209+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116391936 unmapped: 12902400 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:21.504638+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116391936 unmapped: 12902400 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:22.504822+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116391936 unmapped: 12902400 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9697000/0x0/0x4ffc00000, data 0x1f0a64b/0x1fc7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:23.505011+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116400128 unmapped: 12894208 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1299269 data_alloc: 234881024 data_used: 12775424
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:24.505220+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116400128 unmapped: 12894208 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:25.505499+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116400128 unmapped: 12894208 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:26.505729+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116408320 unmapped: 12886016 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9697000/0x0/0x4ffc00000, data 0x1f0a64b/0x1fc7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:27.505975+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116408320 unmapped: 12886016 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9697000/0x0/0x4ffc00000, data 0x1f0a64b/0x1fc7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:28.506287+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116408320 unmapped: 12886016 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1299269 data_alloc: 234881024 data_used: 12775424
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:29.506447+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116408320 unmapped: 12886016 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9697000/0x0/0x4ffc00000, data 0x1f0a64b/0x1fc7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:30.506666+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116408320 unmapped: 12886016 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2435d000
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2435d000 session 0x55fb24bd4b40
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb235d5c00
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb235d5c00 session 0x55fb24bd5680
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2417cc00
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417cc00 session 0x55fb271ac780
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2435d000
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2435d000 session 0x55fb271ac000
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb24bd1c00
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 28.524518967s of 28.681346893s, submitted: 84
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb26c18d20
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:31.506892+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb25c3c400
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25c3c400 session 0x55fb24bd5860
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb235d5c00
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb235d5c00 session 0x55fb26b31c20
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2417cc00
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114810880 unmapped: 14483456 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417cc00 session 0x55fb24bd43c0
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2435d000
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2435d000 session 0x55fb26ae72c0
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:32.507034+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9292000/0x0/0x4ffc00000, data 0x231c65b/0x23da000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114810880 unmapped: 14483456 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:33.507245+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114810880 unmapped: 14483456 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:34.507442+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1323404 data_alloc: 234881024 data_used: 12775424
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114810880 unmapped: 14483456 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:35.507632+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114319360 unmapped: 14974976 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:36.507800+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb24bd1c00
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114319360 unmapped: 14974976 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb26f594a0
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9291000/0x0/0x4ffc00000, data 0x231c67e/0x23db000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:37.507988+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb25e41400
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114319360 unmapped: 14974976 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9291000/0x0/0x4ffc00000, data 0x231c67e/0x23db000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:38.508167+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114360320 unmapped: 14934016 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:39.508522+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1352973 data_alloc: 234881024 data_used: 16846848
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116736000 unmapped: 12558336 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:40.508702+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116776960 unmapped: 12517376 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9291000/0x0/0x4ffc00000, data 0x231c67e/0x23db000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:41.508927+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116809728 unmapped: 12484608 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:42.509080+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116809728 unmapped: 12484608 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:43.509240+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116809728 unmapped: 12484608 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:44.509397+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1353581 data_alloc: 234881024 data_used: 16908288
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116809728 unmapped: 12484608 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:45.509549+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116809728 unmapped: 12484608 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:46.509714+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9291000/0x0/0x4ffc00000, data 0x231c67e/0x23db000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116842496 unmapped: 12451840 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:47.509874+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116842496 unmapped: 12451840 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:48.510074+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 116842496 unmapped: 12451840 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 17.605432510s of 17.711872101s, submitted: 26
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:49.510318+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1406263 data_alloc: 234881024 data_used: 16941056
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 118104064 unmapped: 11190272 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:50.510526+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 118857728 unmapped: 10436608 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:51.510710+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 118857728 unmapped: 10436608 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f8b0f000/0x0/0x4ffc00000, data 0x2a9e67e/0x2b5d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:52.510916+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 118890496 unmapped: 10403840 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:53.511089+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 118890496 unmapped: 10403840 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:54.511309+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1416385 data_alloc: 234881024 data_used: 16928768
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f8b0f000/0x0/0x4ffc00000, data 0x2a9e67e/0x2b5d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 118890496 unmapped: 10403840 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:55.511480+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 118890496 unmapped: 10403840 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:56.511659+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 117432320 unmapped: 11862016 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:57.511858+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 117440512 unmapped: 11853824 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:58.512053+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 117440512 unmapped: 11853824 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:10:59.512212+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1413273 data_alloc: 234881024 data_used: 16928768
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 117440512 unmapped: 11853824 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f8aeb000/0x0/0x4ffc00000, data 0x2ac267e/0x2b81000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:00.512435+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 117440512 unmapped: 11853824 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.007748604s of 12.300899506s, submitted: 81
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:01.512648+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 117440512 unmapped: 11853824 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:02.512923+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25e41400 session 0x55fb25e2ba40
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 117440512 unmapped: 11853824 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb235d5c00
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb235d5c00 session 0x55fb26c2a000
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:03.513097+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f8aeb000/0x0/0x4ffc00000, data 0x2ac267e/0x2b81000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 113590272 unmapped: 15704064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:04.513279+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1298825 data_alloc: 234881024 data_used: 12820480
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 113590272 unmapped: 15704064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:05.513465+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 113590272 unmapped: 15704064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f96a4000/0x0/0x4ffc00000, data 0x1f0a64b/0x1fc7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:06.513632+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 113590272 unmapped: 15704064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:07.513812+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 113590272 unmapped: 15704064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:08.513965+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 113590272 unmapped: 15704064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:09.514126+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1298993 data_alloc: 234881024 data_used: 12820480
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 113590272 unmapped: 15704064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:10.514822+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 113590272 unmapped: 15704064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f96a4000/0x0/0x4ffc00000, data 0x1f0a64b/0x1fc7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:11.514961+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 113590272 unmapped: 15704064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:12.515108+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 113590272 unmapped: 15704064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:13.515319+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb263e9000 session 0x55fb26c192c0
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb261b7800 session 0x55fb24e9b680
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 113590272 unmapped: 15704064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2417cc00
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.708964348s of 12.890141487s, submitted: 37
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417cc00 session 0x55fb238f32c0
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:14.515507+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1145549 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110436352 unmapped: 18857984 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:15.515661+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110436352 unmapped: 18857984 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:16.515917+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110436352 unmapped: 18857984 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:17.516067+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110436352 unmapped: 18857984 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:18.516212+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110436352 unmapped: 18857984 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:19.516399+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1145549 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110436352 unmapped: 18857984 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:20.516632+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110436352 unmapped: 18857984 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:21.516799+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110436352 unmapped: 18857984 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:22.516955+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25e58c00 session 0x55fb24e88780
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb25e58c00
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25e59000 session 0x55fb24bb8d20
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb235d5c00
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110436352 unmapped: 18857984 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:23.517122+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110436352 unmapped: 18857984 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:24.517378+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25e59c00 session 0x55fb24e890e0
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2417cc00
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1145549 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110436352 unmapped: 18857984 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:25.517555+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110436352 unmapped: 18857984 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:26.517749+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25ef4800 session 0x55fb24edad20
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb261b7800
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110436352 unmapped: 18857984 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:27.517934+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110436352 unmapped: 18857984 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:28.518203+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110436352 unmapped: 18857984 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:29.518395+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1145549 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110436352 unmapped: 18857984 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:30.518579+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 16.401422501s of 16.425735474s, submitted: 9
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109821952 unmapped: 19472384 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:31.518792+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109920256 unmapped: 19374080 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:32.518924+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417d400 session 0x55fb25e2c000
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110018560 unmapped: 19275776 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:33.519111+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110018560 unmapped: 19275776 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:34.519261+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1145549 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110018560 unmapped: 19275776 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:35.519399+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110018560 unmapped: 19275776 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:36.519562+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110018560 unmapped: 19275776 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:37.519790+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110018560 unmapped: 19275776 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:38.519958+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110018560 unmapped: 19275776 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3bf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb25ef4800
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25ef4800 session 0x55fb26da4b40
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:39.520178+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1166053 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110075904 unmapped: 19218432 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:40.520386+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110075904 unmapped: 19218432 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:41.520598+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110075904 unmapped: 19218432 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:42.520770+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa220000/0x0/0x4ffc00000, data 0x138f64b/0x144c000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110075904 unmapped: 19218432 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:43.520931+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb263e9000
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.438837051s of 13.030948639s, submitted: 230
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109223936 unmapped: 20070400 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2435d000
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2435d000 session 0x55fb26f503c0
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:44.521116+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb24bd1c00
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb26f50d20
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1165513 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109207552 unmapped: 20086784 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-mon[79770]: from='client.18879 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb26b42c00
Dec 06 10:27:31 compute-1 ceph-mon[79770]: from='client.26963 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb26b42c00 session 0x55fb25107680
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb26b42c00
Dec 06 10:27:31 compute-1 ceph-mon[79770]: from='client.28321 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb26b42c00 session 0x55fb25e2ba40
Dec 06 10:27:31 compute-1 ceph-mon[79770]: from='client.18894 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-mon[79770]: from='client.26969 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:45.521279+0000)
Dec 06 10:27:31 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/1290425549' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109207552 unmapped: 20086784 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/3088289533' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2417d400
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/713374450' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/3308729779' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:46.521477+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa21e000/0x0/0x4ffc00000, data 0x138f67e/0x144e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/907444341' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2435d000
Dec 06 10:27:31 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/2953526502' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109305856 unmapped: 19988480 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa21e000/0x0/0x4ffc00000, data 0x138f67e/0x144e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:47.521680+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109305856 unmapped: 19988480 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:48.521829+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109305856 unmapped: 19988480 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:49.522010+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1182596 data_alloc: 218103808 data_used: 6209536
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109330432 unmapped: 19963904 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:50.522216+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109330432 unmapped: 19963904 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:51.522383+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109330432 unmapped: 19963904 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:52.522489+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa21e000/0x0/0x4ffc00000, data 0x138f67e/0x144e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2435d000 session 0x55fb24bb92c0
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417d400 session 0x55fb271ac780
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109330432 unmapped: 19963904 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb24bd1c00
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:53.522610+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb268fd2c0
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107749376 unmapped: 21544960 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:54.522743+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1153451 data_alloc: 218103808 data_used: 4792320
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107749376 unmapped: 21544960 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:55.522854+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107749376 unmapped: 21544960 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3be000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:56.522993+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107749376 unmapped: 21544960 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:57.523190+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21536768 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3be000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:58.523311+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107757568 unmapped: 21536768 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3be000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:11:59.523446+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1153603 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107765760 unmapped: 21528576 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:00.523655+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 16.980836868s of 17.077106476s, submitted: 31
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107765760 unmapped: 21528576 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:01.523828+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107765760 unmapped: 21528576 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:02.523993+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3be000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107765760 unmapped: 21528576 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:03.524181+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107765760 unmapped: 21528576 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:04.524419+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1153471 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107765760 unmapped: 21528576 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3be000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:05.524574+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb25ef4800
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 108363776 unmapped: 20930560 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25ef4800 session 0x55fb26db9860
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:06.524722+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 108363776 unmapped: 20930560 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:07.524861+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 108371968 unmapped: 20922368 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:08.525097+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 108371968 unmapped: 20922368 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb25ef4800
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25ef4800 session 0x55fb24bd7c20
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:09.525234+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1162335 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2417d400
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417d400 session 0x55fb26f501e0
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 108371968 unmapped: 20922368 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:10.525418+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa2ea000/0x0/0x4ffc00000, data 0x12c564b/0x1382000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2435d000
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2435d000 session 0x55fb26f50f00
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb24bd1c00
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.982028008s of 10.000374794s, submitted: 6
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb243d0960
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 108380160 unmapped: 20914176 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:11.525552+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb26b42c00
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 108380160 unmapped: 20914176 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:12.525752+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 108380160 unmapped: 20914176 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb26b42c00 session 0x55fb26db83c0
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2417d400
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:13.525892+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417d400 session 0x55fb26f510e0
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107429888 unmapped: 21864448 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:14.526177+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1155300 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107438080 unmapped: 21856256 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3be000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:15.526315+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107446272 unmapped: 21848064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:16.526453+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107446272 unmapped: 21848064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:17.526619+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107446272 unmapped: 21848064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:18.526858+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107446272 unmapped: 21848064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3be000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:19.527112+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1155300 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107446272 unmapped: 21848064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:20.527323+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107446272 unmapped: 21848064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:21.527475+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3be000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107446272 unmapped: 21848064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:22.527633+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107446272 unmapped: 21848064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:23.527969+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107446272 unmapped: 21848064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:24.528236+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3be000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1155300 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107446272 unmapped: 21848064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:25.528670+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107446272 unmapped: 21848064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:26.528832+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107446272 unmapped: 21848064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:27.529024+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107446272 unmapped: 21848064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:28.529274+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107446272 unmapped: 21848064 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:29.529443+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3be000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1155300 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107454464 unmapped: 21839872 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:30.529656+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107462656 unmapped: 21831680 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:31.529857+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107462656 unmapped: 21831680 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:32.530020+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107462656 unmapped: 21831680 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:33.530211+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107462656 unmapped: 21831680 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:34.530377+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1155300 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3be000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107462656 unmapped: 21831680 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:35.530536+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107462656 unmapped: 21831680 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:36.530738+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107462656 unmapped: 21831680 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:37.530956+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107462656 unmapped: 21831680 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:38.531281+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4fa3be000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107470848 unmapped: 21823488 heap: 129294336 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:39.531454+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2435d000
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 29.154647827s of 29.205055237s, submitted: 16
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2435d000 session 0x55fb26da4960
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1196112 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107470848 unmapped: 24977408 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:40.531691+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107470848 unmapped: 24977408 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:41.531854+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107479040 unmapped: 24969216 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:42.532042+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9dbf000/0x0/0x4ffc00000, data 0x17f064b/0x18ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107479040 unmapped: 24969216 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:43.532216+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107479040 unmapped: 24969216 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:44.532396+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1196112 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107479040 unmapped: 24969216 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:45.532585+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9dbf000/0x0/0x4ffc00000, data 0x17f064b/0x18ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb24bd1c00
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb26c2b2c0
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107642880 unmapped: 24805376 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:46.532780+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb25ef4800
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 107642880 unmapped: 24805376 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:47.532924+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2892dc00
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9d9a000/0x0/0x4ffc00000, data 0x181466e/0x18d2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109158400 unmapped: 23289856 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:48.533111+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109158400 unmapped: 23289856 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:49.533261+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1243041 data_alloc: 234881024 data_used: 11091968
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9d9a000/0x0/0x4ffc00000, data 0x181466e/0x18d2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109158400 unmapped: 23289856 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:50.533451+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109158400 unmapped: 23289856 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:51.533630+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109158400 unmapped: 23289856 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:52.533767+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109158400 unmapped: 23289856 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:53.533919+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25ef5800 session 0x55fb2719dc20
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109158400 unmapped: 23289856 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:54.534028+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1243041 data_alloc: 234881024 data_used: 11091968
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109166592 unmapped: 23281664 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:55.534241+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9d9a000/0x0/0x4ffc00000, data 0x181466e/0x18d2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109166592 unmapped: 23281664 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:56.534367+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9d9a000/0x0/0x4ffc00000, data 0x181466e/0x18d2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109166592 unmapped: 23281664 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:57.534491+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: mgrc ms_handle_reset ms_handle_reset con 0x55fb26150000
Dec 06 10:27:31 compute-1 ceph-osd[77465]: mgrc reconnect Terminating session with v2:192.168.122.100:6800/3885409716
Dec 06 10:27:31 compute-1 ceph-osd[77465]: mgrc reconnect Starting new session with [v2:192.168.122.100:6800/3885409716,v1:192.168.122.100:6801/3885409716]
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: get_auth_request con 0x55fb26b42c00 auth_method 0
Dec 06 10:27:31 compute-1 ceph-osd[77465]: mgrc handle_mgr_configure stats_period=5
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109133824 unmapped: 23314432 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 18.384727478s of 18.432491302s, submitted: 9
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:58.534587+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9a06000/0x0/0x4ffc00000, data 0x1ba866e/0x1c66000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb263e9000 session 0x55fb268d21e0
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112214016 unmapped: 20234240 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:12:59.534747+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1283175 data_alloc: 234881024 data_used: 11640832
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112279552 unmapped: 20168704 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:00.534919+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112279552 unmapped: 20168704 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:01.535119+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112279552 unmapped: 20168704 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:02.535320+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112271360 unmapped: 20176896 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:03.535505+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f99f8000/0x0/0x4ffc00000, data 0x1bb666e/0x1c74000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112271360 unmapped: 20176896 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:04.535697+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f99f8000/0x0/0x4ffc00000, data 0x1bb666e/0x1c74000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2417d400
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1282131 data_alloc: 234881024 data_used: 11640832
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112271360 unmapped: 20176896 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:05.535824+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112271360 unmapped: 20176896 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:06.535955+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112271360 unmapped: 20176896 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:07.536087+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f99f8000/0x0/0x4ffc00000, data 0x1bb666e/0x1c74000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112263168 unmapped: 20185088 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:08.536261+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112263168 unmapped: 20185088 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:09.536418+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2435d000
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.296810150s of 11.368459702s, submitted: 31
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1282111 data_alloc: 234881024 data_used: 11636736
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112263168 unmapped: 20185088 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:10.536635+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112263168 unmapped: 20185088 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:11.536784+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f99f8000/0x0/0x4ffc00000, data 0x1bb666e/0x1c74000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2892dc00 session 0x55fb271ad680
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25ef4800 session 0x55fb268fe000
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112263168 unmapped: 20185088 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb24bd1c00
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:12.536936+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb25e2a1e0
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109723648 unmapped: 22724608 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:13.537077+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109723648 unmapped: 22724608 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:14.537272+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9fae000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163642 data_alloc: 218103808 data_used: 4792320
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109723648 unmapped: 22724608 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:15.537439+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109723648 unmapped: 22724608 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:16.537594+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109772800 unmapped: 22675456 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:17.537736+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109764608 unmapped: 22683648 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:18.537982+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109764608 unmapped: 22683648 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:19.538172+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.109940529s of 10.186728477s, submitted: 30
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163202 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109764608 unmapped: 22683648 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:20.538476+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109764608 unmapped: 22683648 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:21.538623+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109764608 unmapped: 22683648 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:22.538779+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109764608 unmapped: 22683648 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:23.539002+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109764608 unmapped: 22683648 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:24.539222+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163070 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109764608 unmapped: 22683648 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:25.539445+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109764608 unmapped: 22683648 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:26.539711+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109764608 unmapped: 22683648 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:27.539866+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109764608 unmapped: 22683648 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:28.540061+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109764608 unmapped: 22683648 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:29.540227+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163070 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109764608 unmapped: 22683648 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:30.540458+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109764608 unmapped: 22683648 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:31.540634+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109764608 unmapped: 22683648 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:32.540899+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109764608 unmapped: 22683648 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:33.541253+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109764608 unmapped: 22683648 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:34.541414+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1163070 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 109772800 unmapped: 22675456 heap: 132448256 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb25ef5800
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 15.609736443s of 15.626793861s, submitted: 5
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:35.541604+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25ef5800 session 0x55fb268d32c0
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110092288 unmapped: 26034176 heap: 136126464 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:36.541765+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110092288 unmapped: 26034176 heap: 136126464 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:37.541957+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110092288 unmapped: 26034176 heap: 136126464 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:38.542130+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f99b7000/0x0/0x4ffc00000, data 0x17e864b/0x18a5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110092288 unmapped: 26034176 heap: 136126464 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:39.542319+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb25ef5800
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1209438 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25ef5800 session 0x55fb26649e00
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110403584 unmapped: 25722880 heap: 136126464 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:40.542527+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb24bd1c00
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110403584 unmapped: 25722880 heap: 136126464 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:41.542647+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb25ef4800
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417d400 session 0x55fb268f6960
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 110706688 unmapped: 25419776 heap: 136126464 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:42.542762+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9993000/0x0/0x4ffc00000, data 0x180c64b/0x18c9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112590848 unmapped: 23535616 heap: 136126464 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:43.542897+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:44.543084+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112590848 unmapped: 23535616 heap: 136126464 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9993000/0x0/0x4ffc00000, data 0x180c64b/0x18c9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1251390 data_alloc: 234881024 data_used: 10969088
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:45.543270+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112590848 unmapped: 23535616 heap: 136126464 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:46.543437+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112590848 unmapped: 23535616 heap: 136126464 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:47.543608+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112590848 unmapped: 23535616 heap: 136126464 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:48.543778+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112590848 unmapped: 23535616 heap: 136126464 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:49.543930+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112590848 unmapped: 23535616 heap: 136126464 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9993000/0x0/0x4ffc00000, data 0x180c64b/0x18c9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1251390 data_alloc: 234881024 data_used: 10969088
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:50.544225+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112590848 unmapped: 23535616 heap: 136126464 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:51.544384+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112590848 unmapped: 23535616 heap: 136126464 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9993000/0x0/0x4ffc00000, data 0x180c64b/0x18c9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb263e9000
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 16.847808838s of 16.891704559s, submitted: 6
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb263e9000 session 0x55fb240c2f00
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:52.544535+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 112812032 unmapped: 26992640 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2892dc00
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:53.544698+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115073024 unmapped: 24731648 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:54.544863+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115073024 unmapped: 24731648 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1386308 data_alloc: 234881024 data_used: 11198464
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:55.545043+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115712000 unmapped: 24092672 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f8877000/0x0/0x4ffc00000, data 0x292864b/0x29e5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2892d800
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2892d800 session 0x55fb240c10e0
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:56.545209+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115712000 unmapped: 24092672 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2892d400
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2892d400 session 0x55fb271acd20
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2417d400
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417d400 session 0x55fb26a99c20
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb25ef5800
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25ef5800 session 0x55fb268fe1e0
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:57.545386+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115712000 unmapped: 24092672 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb263e9000
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:58.545586+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115712000 unmapped: 24092672 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:13:59.545748+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119521280 unmapped: 20283392 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1471204 data_alloc: 234881024 data_used: 23625728
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:00.545969+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 125157376 unmapped: 14647296 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f8874000/0x0/0x4ffc00000, data 0x292b64b/0x29e8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:01.546127+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 125157376 unmapped: 14647296 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:02.546310+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 125157376 unmapped: 14647296 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:03.546723+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 125157376 unmapped: 14647296 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:04.546877+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 125157376 unmapped: 14647296 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f8874000/0x0/0x4ffc00000, data 0x292b64b/0x29e8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1472076 data_alloc: 234881024 data_used: 23629824
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:05.547071+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 125157376 unmapped: 14647296 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f8874000/0x0/0x4ffc00000, data 0x292b64b/0x29e8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:06.547208+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 125157376 unmapped: 14647296 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:07.547352+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 125157376 unmapped: 14647296 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:08.547532+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 125157376 unmapped: 14647296 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:09.547668+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 125214720 unmapped: 14589952 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 17.025381088s of 17.272668839s, submitted: 73
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f84cf000/0x0/0x4ffc00000, data 0x2ca464b/0x2d61000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1526224 data_alloc: 234881024 data_used: 23797760
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:10.547862+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 129245184 unmapped: 10559488 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:11.548010+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 129335296 unmapped: 10469376 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:12.548193+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 129417216 unmapped: 10387456 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:13.548360+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 129417216 unmapped: 10387456 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f814b000/0x0/0x4ffc00000, data 0x305364b/0x3110000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:14.548525+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 10371072 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1530806 data_alloc: 234881024 data_used: 23859200
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:15.548712+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 10371072 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:16.548872+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 129433600 unmapped: 10371072 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:17.549013+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 129458176 unmapped: 10346496 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f8149000/0x0/0x4ffc00000, data 0x305664b/0x3113000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:18.549240+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 129490944 unmapped: 10313728 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:19.549409+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 129490944 unmapped: 10313728 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1528822 data_alloc: 234881024 data_used: 23859200
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:20.549616+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 129490944 unmapped: 10313728 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:21.549769+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f8149000/0x0/0x4ffc00000, data 0x305664b/0x3113000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 129490944 unmapped: 10313728 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f8149000/0x0/0x4ffc00000, data 0x305664b/0x3113000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:22.550139+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 129490944 unmapped: 10313728 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f8149000/0x0/0x4ffc00000, data 0x305664b/0x3113000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:23.550307+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 129490944 unmapped: 10313728 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:24.550456+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 129490944 unmapped: 10313728 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1528822 data_alloc: 234881024 data_used: 23859200
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:25.550600+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 15.850658417s of 15.982189178s, submitted: 58
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 129490944 unmapped: 10313728 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb263e9000 session 0x55fb26ae6b40
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2892d400
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2892d400 session 0x55fb25e2cb40
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:26.550762+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119799808 unmapped: 20004864 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:27.550941+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119799808 unmapped: 20004864 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:28.551103+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f91ec000/0x0/0x4ffc00000, data 0x1c3d64b/0x1cfa000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119799808 unmapped: 20004864 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:29.551297+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f91ec000/0x0/0x4ffc00000, data 0x1c3d64b/0x1cfa000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119799808 unmapped: 20004864 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1305206 data_alloc: 234881024 data_used: 11198464
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:30.551503+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119799808 unmapped: 20004864 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:31.551671+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119799808 unmapped: 20004864 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:32.551838+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f91ec000/0x0/0x4ffc00000, data 0x1c3d64b/0x1cfa000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119799808 unmapped: 20004864 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25ef4800 session 0x55fb25107c20
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb268fad20
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2417d400
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:33.551980+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115548160 unmapped: 24256512 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417d400 session 0x55fb24e894a0
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:34.552179+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115556352 unmapped: 24248320 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:35.552351+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1182518 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:36.552501+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:37.552674+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:38.552833+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:39.553012+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:40.553278+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1182518 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:41.553421+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:42.553602+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:43.553766+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:44.554330+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:45.554552+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1182518 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:46.554744+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:47.554952+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:48.555191+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:49.555495+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:50.555737+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1182518 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:51.555920+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:52.556199+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:53.556356+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:54.556474+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:55.556691+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1182518 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:56.556921+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:57.557077+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:58.557227+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 24428544 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb25ef5800
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25ef5800 session 0x55fb24bd50e0
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb263e9000
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb263e9000 session 0x55fb24f02000
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2892d400
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2892d400 session 0x55fb26ae7a40
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2892d400
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:14:59.557379+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2892d400 session 0x55fb24bb85a0
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2417d400
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 33.909008026s of 33.959445953s, submitted: 21
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417d400 session 0x55fb2422b680
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb24bd1c00
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb26ae6000
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb25ef5800
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25ef5800 session 0x55fb24bd5680
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb263e9000
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115572736 unmapped: 24231936 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb263e9000 session 0x55fb24f03c20
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb263e9000
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb263e9000 session 0x55fb24bd4960
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:00.557610+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9b13000/0x0/0x4ffc00000, data 0x168c64b/0x1749000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1222243 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115580928 unmapped: 24223744 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:01.557764+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115580928 unmapped: 24223744 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:02.557866+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115580928 unmapped: 24223744 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9b13000/0x0/0x4ffc00000, data 0x168c64b/0x1749000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:03.558010+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2417d400
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115580928 unmapped: 24223744 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417d400 session 0x55fb26a972c0
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb24bd1c00
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb24bd1c00 session 0x55fb240c32c0
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:04.558182+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115580928 unmapped: 24223744 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb25ef5800
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25ef5800 session 0x55fb240c21e0
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2892d400
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2892d400 session 0x55fb24e89c20
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:05.558342+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1224048 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115572736 unmapped: 24231936 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:06.558515+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2892d400
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115580928 unmapped: 24223744 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:07.558649+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115761152 unmapped: 24043520 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:08.558802+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115761152 unmapped: 24043520 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9b12000/0x0/0x4ffc00000, data 0x168c66e/0x174a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:09.558925+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115761152 unmapped: 24043520 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:10.559092+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1249716 data_alloc: 218103808 data_used: 8503296
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115761152 unmapped: 24043520 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:11.559290+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115761152 unmapped: 24043520 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9b12000/0x0/0x4ffc00000, data 0x168c66e/0x174a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:12.559502+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115761152 unmapped: 24043520 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:13.559656+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115761152 unmapped: 24043520 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:14.559812+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115769344 unmapped: 24035328 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:15.559995+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1249716 data_alloc: 218103808 data_used: 8503296
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9b12000/0x0/0x4ffc00000, data 0x168c66e/0x174a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115769344 unmapped: 24035328 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:16.560110+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115769344 unmapped: 24035328 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:17.560273+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115769344 unmapped: 24035328 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 18.381628036s of 18.489994049s, submitted: 27
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:18.560434+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119668736 unmapped: 20135936 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:19.560568+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119840768 unmapped: 19963904 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:20.560781+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1326434 data_alloc: 218103808 data_used: 8761344
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119840768 unmapped: 19963904 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:21.560969+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9225000/0x0/0x4ffc00000, data 0x1f6a66e/0x2028000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119840768 unmapped: 19963904 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:22.561096+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119840768 unmapped: 19963904 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:23.561262+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9225000/0x0/0x4ffc00000, data 0x1f6a66e/0x2028000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119840768 unmapped: 19963904 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:24.561456+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119840768 unmapped: 19963904 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:25.561623+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1326434 data_alloc: 218103808 data_used: 8761344
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119840768 unmapped: 19963904 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:26.561775+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119840768 unmapped: 19963904 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:27.561939+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119840768 unmapped: 19963904 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:28.562116+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9225000/0x0/0x4ffc00000, data 0x1f6a66e/0x2028000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119840768 unmapped: 19963904 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:29.562250+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119840768 unmapped: 19963904 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:30.562455+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1326434 data_alloc: 218103808 data_used: 8761344
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119840768 unmapped: 19963904 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:31.562617+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119840768 unmapped: 19963904 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:32.562765+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 119840768 unmapped: 19963904 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:33.562900+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 15.395916939s of 15.577631950s, submitted: 77
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2892d400 session 0x55fb26c2a5a0
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9234000/0x0/0x4ffc00000, data 0x1f6a66e/0x2028000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2892c800
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 118964224 unmapped: 20840448 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2892c800 session 0x55fb25e2cb40
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:34.563028+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:35.563194+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1193304 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:36.563309+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9fae000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:37.563427+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:38.563548+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9fae000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:39.563691+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:40.563863+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1193304 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9fae000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:41.563993+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:42.564139+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2435d000 session 0x55fb26c192c0
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:43.564317+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9fae000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:44.564455+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:45.564618+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1193304 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:46.564823+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:47.565037+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9fae000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:48.565235+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:49.565361+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:50.565547+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1193304 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:51.565674+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:52.565816+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9fae000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9fae000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:53.565988+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:54.566267+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9fae000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:55.566766+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1193304 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:56.566963+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:57.567125+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9fae000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:58.567348+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:15:59.567540+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:00.567730+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1193304 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:01.567903+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9fae000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:02.568067+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:03.568222+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:04.568664+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9fae000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:05.568968+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1193304 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:06.569263+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25fd8000 session 0x55fb26f38000
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:07.569622+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:08.569953+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115802112 unmapped: 24002560 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:09.570128+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 35.909954071s of 36.011264801s, submitted: 36
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9fae000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115818496 unmapped: 23986176 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:10.570366+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1193172 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115818496 unmapped: 23986176 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:11.570532+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115818496 unmapped: 23986176 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:12.570807+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115818496 unmapped: 23986176 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9fae000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:13.570999+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115818496 unmapped: 23986176 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:14.571214+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115818496 unmapped: 23986176 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:15.571413+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1193172 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115818496 unmapped: 23986176 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:16.571587+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115818496 unmapped: 23986176 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:17.571762+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9fae000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115818496 unmapped: 23986176 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:18.571914+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115818496 unmapped: 23986176 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:19.572068+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9fae000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115826688 unmapped: 23977984 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:20.572304+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1193172 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115826688 unmapped: 23977984 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:21.572462+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115826688 unmapped: 23977984 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:22.572651+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115826688 unmapped: 23977984 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:23.572883+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115826688 unmapped: 23977984 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:24.573047+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9fae000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115826688 unmapped: 23977984 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:25.573221+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1193172 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115826688 unmapped: 23977984 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:26.573384+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115826688 unmapped: 23977984 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:27.573531+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115826688 unmapped: 23977984 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:28.573733+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9fae000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115834880 unmapped: 23969792 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:29.573909+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9fae000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2892dc00 session 0x55fb26da45a0
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115834880 unmapped: 23969792 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:30.574101+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 21.256072998s of 21.260541916s, submitted: 1
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115834880 unmapped: 23969792 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:31.574263+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115834880 unmapped: 23969792 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:32.574425+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115834880 unmapped: 23969792 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:33.574545+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115834880 unmapped: 23969792 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:34.574697+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115834880 unmapped: 23969792 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:35.574861+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115834880 unmapped: 23969792 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:36.575010+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115834880 unmapped: 23969792 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:37.575245+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115834880 unmapped: 23969792 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:38.575539+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115834880 unmapped: 23969792 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:39.575696+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115834880 unmapped: 23969792 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:40.575886+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb2435d000
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.078483582s of 10.083856583s, submitted: 1
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192880 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115834880 unmapped: 23969792 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:41.576011+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115834880 unmapped: 23969792 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:42.576201+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115834880 unmapped: 23969792 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:43.576358+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115834880 unmapped: 23969792 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:44.576491+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 23961600 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:45.576643+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192896 data_alloc: 218103808 data_used: 4792320
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 23961600 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:46.576788+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 23961600 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:47.576934+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 23961600 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:48.577081+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 23961600 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:49.577235+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 23961600 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:50.577432+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192896 data_alloc: 218103808 data_used: 4792320
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115843072 unmapped: 23961600 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:51.577632+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115851264 unmapped: 23953408 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:52.577775+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115851264 unmapped: 23953408 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:53.577912+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115851264 unmapped: 23953408 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:54.578055+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115851264 unmapped: 23953408 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:55.578196+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192896 data_alloc: 218103808 data_used: 4792320
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115851264 unmapped: 23953408 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:56.578322+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115810304 unmapped: 23994368 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:57.578439+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 16.716539383s of 16.728366852s, submitted: 3
Dec 06 10:27:31 compute-1 ceph-osd[77465]: do_command 'config diff' '{prefix=config diff}'
Dec 06 10:27:31 compute-1 ceph-osd[77465]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Dec 06 10:27:31 compute-1 ceph-osd[77465]: do_command 'config show' '{prefix=config show}'
Dec 06 10:27:31 compute-1 ceph-osd[77465]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Dec 06 10:27:31 compute-1 ceph-osd[77465]: do_command 'counter dump' '{prefix=counter dump}'
Dec 06 10:27:31 compute-1 ceph-osd[77465]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Dec 06 10:27:31 compute-1 ceph-osd[77465]: do_command 'counter schema' '{prefix=counter schema}'
Dec 06 10:27:31 compute-1 ceph-osd[77465]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115564544 unmapped: 24240128 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:58.578602+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115302400 unmapped: 24502272 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:16:59.578804+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115712000 unmapped: 24092672 heap: 139804672 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:00.578973+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: do_command 'log dump' '{prefix=log dump}'
Dec 06 10:27:31 compute-1 ceph-osd[77465]: do_command 'log dump' '{prefix=log dump}' result is 0 bytes
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec 06 10:27:31 compute-1 ceph-osd[77465]: do_command 'perf dump' '{prefix=perf dump}'
Dec 06 10:27:31 compute-1 ceph-osd[77465]: do_command 'perf dump' '{prefix=perf dump}' result is 0 bytes
Dec 06 10:27:31 compute-1 ceph-osd[77465]: do_command 'perf histogram dump' '{prefix=perf histogram dump}'
Dec 06 10:27:31 compute-1 ceph-osd[77465]: do_command 'perf histogram dump' '{prefix=perf histogram dump}' result is 0 bytes
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115605504 unmapped: 35241984 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:01.579124+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: do_command 'perf schema' '{prefix=perf schema}'
Dec 06 10:27:31 compute-1 ceph-osd[77465]: do_command 'perf schema' '{prefix=perf schema}' result is 0 bytes
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115089408 unmapped: 35758080 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:02.579267+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115089408 unmapped: 35758080 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:03.579410+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:04.579551+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115089408 unmapped: 35758080 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:05.579688+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115089408 unmapped: 35758080 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:06.579853+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115089408 unmapped: 35758080 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:07.580005+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115089408 unmapped: 35758080 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:08.580204+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115089408 unmapped: 35758080 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:09.580415+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115089408 unmapped: 35758080 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:10.580609+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115097600 unmapped: 35749888 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:11.580799+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115097600 unmapped: 35749888 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:12.580963+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115097600 unmapped: 35749888 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:13.581210+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115097600 unmapped: 35749888 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:14.581388+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115097600 unmapped: 35749888 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:15.581529+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115097600 unmapped: 35749888 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:16.581668+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115097600 unmapped: 35749888 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:17.581811+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115097600 unmapped: 35749888 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:18.582675+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115097600 unmapped: 35749888 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:19.582837+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115097600 unmapped: 35749888 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:20.583137+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115097600 unmapped: 35749888 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:21.583497+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115097600 unmapped: 35749888 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:22.583688+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115097600 unmapped: 35749888 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:23.583906+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115105792 unmapped: 35741696 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:24.584196+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115105792 unmapped: 35741696 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:25.584391+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115105792 unmapped: 35741696 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:26.584556+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115105792 unmapped: 35741696 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:27.584754+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115105792 unmapped: 35741696 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:28.584911+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115105792 unmapped: 35741696 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:29.585060+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115105792 unmapped: 35741696 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:30.585297+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115105792 unmapped: 35741696 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:31.585564+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115105792 unmapped: 35741696 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:32.585875+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115105792 unmapped: 35741696 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:33.586121+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115105792 unmapped: 35741696 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:34.586291+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115113984 unmapped: 35733504 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:35.586629+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115113984 unmapped: 35733504 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:36.587111+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115113984 unmapped: 35733504 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:37.587306+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115113984 unmapped: 35733504 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:38.587446+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115113984 unmapped: 35733504 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:39.587607+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115113984 unmapped: 35733504 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:40.587883+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115113984 unmapped: 35733504 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:41.588035+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115113984 unmapped: 35733504 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:42.588360+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115113984 unmapped: 35733504 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:43.588579+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115122176 unmapped: 35725312 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:44.588798+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115122176 unmapped: 35725312 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:45.589030+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115122176 unmapped: 35725312 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:46.589201+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115122176 unmapped: 35725312 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:47.589469+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115122176 unmapped: 35725312 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:48.589642+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115122176 unmapped: 35725312 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:49.589830+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115122176 unmapped: 35725312 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:50.590083+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115122176 unmapped: 35725312 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:51.590271+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115122176 unmapped: 35725312 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:52.590448+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115122176 unmapped: 35725312 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:53.590662+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115122176 unmapped: 35725312 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:54.590880+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115130368 unmapped: 35717120 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:55.591065+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115130368 unmapped: 35717120 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:56.591207+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115130368 unmapped: 35717120 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:57.591366+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115130368 unmapped: 35717120 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:58.591524+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115130368 unmapped: 35717120 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:17:59.591715+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115130368 unmapped: 35717120 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:00.591917+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115130368 unmapped: 35717120 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:01.592062+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115130368 unmapped: 35717120 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:02.592209+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115130368 unmapped: 35717120 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:03.592424+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115130368 unmapped: 35717120 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:04.592659+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115130368 unmapped: 35717120 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:05.592846+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115138560 unmapped: 35708928 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:06.593058+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115138560 unmapped: 35708928 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:07.593349+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115138560 unmapped: 35708928 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:08.593636+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115138560 unmapped: 35708928 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:09.593827+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115138560 unmapped: 35708928 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:10.594109+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115138560 unmapped: 35708928 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:11.594261+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115138560 unmapped: 35708928 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:12.594418+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115138560 unmapped: 35708928 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:13.594584+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115138560 unmapped: 35708928 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:14.594792+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115138560 unmapped: 35708928 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:15.594947+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115138560 unmapped: 35708928 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:16.595175+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115138560 unmapped: 35708928 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:17.595403+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115138560 unmapped: 35708928 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:18.596038+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115138560 unmapped: 35708928 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:19.596219+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115138560 unmapped: 35708928 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:20.596486+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115138560 unmapped: 35708928 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:21.596729+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115138560 unmapped: 35708928 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:22.597252+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115146752 unmapped: 35700736 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:23.597410+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115146752 unmapped: 35700736 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:24.597549+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115146752 unmapped: 35700736 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:25.597927+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115146752 unmapped: 35700736 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:26.598127+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115146752 unmapped: 35700736 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:27.598489+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115146752 unmapped: 35700736 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:28.598698+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115146752 unmapped: 35700736 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:29.598945+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115146752 unmapped: 35700736 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:30.599278+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115146752 unmapped: 35700736 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:31.599653+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115146752 unmapped: 35700736 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:32.599999+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115146752 unmapped: 35700736 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:33.600221+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115146752 unmapped: 35700736 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:34.600415+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115154944 unmapped: 35692544 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:35.600623+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115154944 unmapped: 35692544 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:36.600754+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115154944 unmapped: 35692544 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:37.600920+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115154944 unmapped: 35692544 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:38.601100+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115154944 unmapped: 35692544 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:39.601356+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115154944 unmapped: 35692544 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:40.601619+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115154944 unmapped: 35692544 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:41.601780+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115154944 unmapped: 35692544 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:42.601989+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115154944 unmapped: 35692544 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:43.602221+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115154944 unmapped: 35692544 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:44.602423+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115163136 unmapped: 35684352 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:45.602558+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115163136 unmapped: 35684352 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:46.602725+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115163136 unmapped: 35684352 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:47.602869+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115163136 unmapped: 35684352 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:48.603042+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115163136 unmapped: 35684352 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:49.603253+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115163136 unmapped: 35684352 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:50.603374+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115163136 unmapped: 35684352 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:51.603523+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115163136 unmapped: 35684352 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:52.603648+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115171328 unmapped: 35676160 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:53.603790+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115171328 unmapped: 35676160 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:54.603987+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115171328 unmapped: 35676160 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:55.604182+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115171328 unmapped: 35676160 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:56.604355+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115171328 unmapped: 35676160 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:57.604478+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115171328 unmapped: 35676160 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:58.604630+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115171328 unmapped: 35676160 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:18:59.604820+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115171328 unmapped: 35676160 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:00.604985+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115171328 unmapped: 35676160 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:01.605121+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115171328 unmapped: 35676160 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:02.605298+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115179520 unmapped: 35667968 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:03.605428+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115179520 unmapped: 35667968 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:04.605615+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115179520 unmapped: 35667968 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:05.605810+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 2400.1 total, 600.0 interval
                                           Cumulative writes: 10K writes, 41K keys, 10K commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.01 MB/s
                                           Cumulative WAL: 10K writes, 2997 syncs, 3.65 writes per sync, written: 0.03 GB, 0.01 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 1703 writes, 5619 keys, 1703 commit groups, 1.0 writes per commit group, ingest: 5.35 MB, 0.01 MB/s
                                           Interval WAL: 1703 writes, 744 syncs, 2.29 writes per sync, written: 0.01 GB, 0.01 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115179520 unmapped: 35667968 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:06.605963+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115179520 unmapped: 35667968 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:07.606122+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115179520 unmapped: 35667968 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:08.606504+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115179520 unmapped: 35667968 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:09.606705+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115179520 unmapped: 35667968 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:10.606936+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115187712 unmapped: 35659776 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:11.607093+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115187712 unmapped: 35659776 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:12.607279+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115187712 unmapped: 35659776 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:13.607438+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115187712 unmapped: 35659776 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:14.607647+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115187712 unmapped: 35659776 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:15.607779+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115187712 unmapped: 35659776 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:16.607961+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115187712 unmapped: 35659776 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:17.608075+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115187712 unmapped: 35659776 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:18.608240+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115187712 unmapped: 35659776 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:19.608422+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115187712 unmapped: 35659776 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:20.608639+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115187712 unmapped: 35659776 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:21.608835+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115195904 unmapped: 35651584 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:22.609072+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115195904 unmapped: 35651584 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:23.609261+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115195904 unmapped: 35651584 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:24.609435+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115195904 unmapped: 35651584 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:25.609603+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115195904 unmapped: 35651584 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:26.609788+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115195904 unmapped: 35651584 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:27.609919+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115195904 unmapped: 35651584 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:28.610056+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115195904 unmapped: 35651584 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:29.610209+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115195904 unmapped: 35651584 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:30.610388+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115195904 unmapped: 35651584 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:31.610556+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115204096 unmapped: 35643392 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:32.610726+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115204096 unmapped: 35643392 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:33.610948+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115204096 unmapped: 35643392 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:34.611091+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115204096 unmapped: 35643392 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:35.611218+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115204096 unmapped: 35643392 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:36.611356+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115204096 unmapped: 35643392 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:37.611530+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115204096 unmapped: 35643392 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:38.611704+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115204096 unmapped: 35643392 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:39.611891+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115204096 unmapped: 35643392 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:40.612093+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115212288 unmapped: 35635200 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:41.612293+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115212288 unmapped: 35635200 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:42.612443+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115212288 unmapped: 35635200 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:43.612596+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115212288 unmapped: 35635200 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:44.612794+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115212288 unmapped: 35635200 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:45.612974+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115212288 unmapped: 35635200 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:46.613106+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115212288 unmapped: 35635200 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:47.613311+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115212288 unmapped: 35635200 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:48.613461+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115212288 unmapped: 35635200 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:49.613633+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115212288 unmapped: 35635200 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:50.614718+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115220480 unmapped: 35627008 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:51.615318+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115220480 unmapped: 35627008 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:52.615598+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115220480 unmapped: 35627008 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:53.616360+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115220480 unmapped: 35627008 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:54.616888+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115220480 unmapped: 35627008 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:55.617297+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115220480 unmapped: 35627008 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:56.617526+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:57.618114+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115220480 unmapped: 35627008 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:58.618536+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115220480 unmapped: 35627008 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:19:59.618939+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115228672 unmapped: 35618816 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:00.619165+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115228672 unmapped: 35618816 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:01.619519+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115228672 unmapped: 35618816 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:02.619730+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115228672 unmapped: 35618816 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:03.619886+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115228672 unmapped: 35618816 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:04.620217+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115228672 unmapped: 35618816 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:05.620519+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115228672 unmapped: 35618816 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:06.620772+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115228672 unmapped: 35618816 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:07.621054+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115228672 unmapped: 35618816 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:08.621270+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115236864 unmapped: 35610624 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:09.621455+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115236864 unmapped: 35610624 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:10.621713+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115236864 unmapped: 35610624 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:11.621863+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115236864 unmapped: 35610624 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:12.622129+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115236864 unmapped: 35610624 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:13.622405+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115236864 unmapped: 35610624 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:14.622575+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115236864 unmapped: 35610624 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:15.622810+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115236864 unmapped: 35610624 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:16.623004+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115236864 unmapped: 35610624 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:17.623190+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115245056 unmapped: 35602432 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:18.623327+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115245056 unmapped: 35602432 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:19.623473+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115245056 unmapped: 35602432 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:20.623685+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115245056 unmapped: 35602432 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:21.623834+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115245056 unmapped: 35602432 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:22.623996+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115245056 unmapped: 35602432 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:23.624128+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115245056 unmapped: 35602432 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:24.624359+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115245056 unmapped: 35602432 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:25.624551+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115245056 unmapped: 35602432 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:26.624716+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115245056 unmapped: 35602432 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:27.624914+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115245056 unmapped: 35602432 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:28.625132+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115245056 unmapped: 35602432 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:29.625312+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115253248 unmapped: 35594240 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:30.625517+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115253248 unmapped: 35594240 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:31.625678+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115253248 unmapped: 35594240 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:32.625834+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115253248 unmapped: 35594240 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:33.626003+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115253248 unmapped: 35594240 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:34.626270+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115253248 unmapped: 35594240 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:35.626510+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115253248 unmapped: 35594240 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:36.626674+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115253248 unmapped: 35594240 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:37.626835+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115261440 unmapped: 35586048 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:38.626978+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115261440 unmapped: 35586048 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:39.627204+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115261440 unmapped: 35586048 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:40.627420+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115261440 unmapped: 35586048 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:41.627585+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115261440 unmapped: 35586048 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:42.627796+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115261440 unmapped: 35586048 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:43.628213+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115261440 unmapped: 35586048 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:44.628384+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114933760 unmapped: 35913728 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:45.628557+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114941952 unmapped: 35905536 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:46.628723+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114941952 unmapped: 35905536 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:47.628891+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114941952 unmapped: 35905536 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:48.629103+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114941952 unmapped: 35905536 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:49.629294+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114941952 unmapped: 35905536 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:50.629553+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114941952 unmapped: 35905536 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:51.629729+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114941952 unmapped: 35905536 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:52.629981+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114941952 unmapped: 35905536 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:53.630170+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114941952 unmapped: 35905536 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:54.630396+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114941952 unmapped: 35905536 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:55.630599+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114941952 unmapped: 35905536 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:56.630756+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114941952 unmapped: 35905536 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:57.630896+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114950144 unmapped: 35897344 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:58.631052+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114950144 unmapped: 35897344 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:20:59.631261+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114950144 unmapped: 35897344 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:00.631481+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114950144 unmapped: 35897344 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:01.631683+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114950144 unmapped: 35897344 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:02.631927+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114950144 unmapped: 35897344 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:03.632130+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114950144 unmapped: 35897344 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:04.632328+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114950144 unmapped: 35897344 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:05.632523+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114950144 unmapped: 35897344 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:06.632723+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114958336 unmapped: 35889152 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:07.632936+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114958336 unmapped: 35889152 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:08.633233+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114958336 unmapped: 35889152 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:09.633419+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114958336 unmapped: 35889152 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:10.633681+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114958336 unmapped: 35889152 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:11.633913+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114958336 unmapped: 35889152 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:12.634094+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114958336 unmapped: 35889152 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:13.634319+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114958336 unmapped: 35889152 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:14.634515+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114966528 unmapped: 35880960 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:15.634695+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114966528 unmapped: 35880960 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:16.634932+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114966528 unmapped: 35880960 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:17.635177+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114966528 unmapped: 35880960 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:18.635375+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114966528 unmapped: 35880960 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:19.635590+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114966528 unmapped: 35880960 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:20.635838+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114966528 unmapped: 35880960 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:21.636059+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114966528 unmapped: 35880960 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:22.636228+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114966528 unmapped: 35880960 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:23.636380+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114966528 unmapped: 35880960 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:24.636564+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114974720 unmapped: 35872768 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:25.636756+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114974720 unmapped: 35872768 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:26.637103+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114974720 unmapped: 35872768 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192764 data_alloc: 218103808 data_used: 4792320
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:27.637402+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114974720 unmapped: 35872768 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:28.637566+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114974720 unmapped: 35872768 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:29.637770+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114974720 unmapped: 35872768 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:30.638092+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 272.946929932s of 272.950836182s, submitted: 1
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 114982912 unmapped: 35864576 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:31.638270+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115032064 unmapped: 35815424 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:32.638430+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115089408 unmapped: 35758080 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:33.638587+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115179520 unmapped: 35667968 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:34.638795+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115204096 unmapped: 35643392 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:35.638956+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115310592 unmapped: 35536896 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:36.639198+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115310592 unmapped: 35536896 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:37.639415+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115310592 unmapped: 35536896 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:38.639623+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115310592 unmapped: 35536896 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:39.639797+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115310592 unmapped: 35536896 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:40.640073+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115310592 unmapped: 35536896 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:41.640267+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115310592 unmapped: 35536896 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:42.640482+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115310592 unmapped: 35536896 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:43.640691+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115310592 unmapped: 35536896 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:44.640913+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115310592 unmapped: 35536896 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:45.641301+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115310592 unmapped: 35536896 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:46.641574+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115310592 unmapped: 35536896 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:47.641802+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115310592 unmapped: 35536896 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:48.642028+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115310592 unmapped: 35536896 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:49.642228+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115310592 unmapped: 35536896 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:50.642514+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115310592 unmapped: 35536896 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:51.642773+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115310592 unmapped: 35536896 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:52.643041+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115310592 unmapped: 35536896 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:53.643293+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115310592 unmapped: 35536896 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:54.643575+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115310592 unmapped: 35536896 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:55.643834+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115310592 unmapped: 35536896 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:56.644088+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115310592 unmapped: 35536896 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:57.644404+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115310592 unmapped: 35536896 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:58.644707+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115310592 unmapped: 35536896 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:21:59.644894+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115310592 unmapped: 35536896 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:00.645274+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115310592 unmapped: 35536896 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:01.645512+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115318784 unmapped: 35528704 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:02.645772+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115326976 unmapped: 35520512 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:03.646066+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115335168 unmapped: 35512320 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:04.646511+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115335168 unmapped: 35512320 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:05.646756+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115335168 unmapped: 35512320 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:06.646950+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115335168 unmapped: 35512320 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:07.647163+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115335168 unmapped: 35512320 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:08.647427+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115335168 unmapped: 35512320 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:09.647655+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115335168 unmapped: 35512320 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:10.647960+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115335168 unmapped: 35512320 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:11.648185+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115335168 unmapped: 35512320 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:12.648360+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115335168 unmapped: 35512320 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:13.648600+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115335168 unmapped: 35512320 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:14.648841+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115335168 unmapped: 35512320 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:15.649096+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115335168 unmapped: 35512320 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:16.649359+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115335168 unmapped: 35512320 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:17.649578+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115335168 unmapped: 35512320 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:18.649884+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115335168 unmapped: 35512320 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:19.650132+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115335168 unmapped: 35512320 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:20.650444+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115335168 unmapped: 35512320 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:21.650680+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115335168 unmapped: 35512320 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:22.650926+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115335168 unmapped: 35512320 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:23.651203+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115335168 unmapped: 35512320 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:24.651468+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115335168 unmapped: 35512320 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:25.651839+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115343360 unmapped: 35504128 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:26.652056+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115343360 unmapped: 35504128 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:27.652256+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115351552 unmapped: 35495936 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:28.652499+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115351552 unmapped: 35495936 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:29.652721+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115351552 unmapped: 35495936 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:30.653063+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115351552 unmapped: 35495936 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:31.653448+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115351552 unmapped: 35495936 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:32.653843+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115351552 unmapped: 35495936 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:33.654200+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115351552 unmapped: 35495936 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:34.654507+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115351552 unmapped: 35495936 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:35.654808+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115351552 unmapped: 35495936 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:36.655115+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115359744 unmapped: 35487744 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:37.655390+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115359744 unmapped: 35487744 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:38.655626+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115359744 unmapped: 35487744 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:39.655878+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115359744 unmapped: 35487744 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:40.656224+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115359744 unmapped: 35487744 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:41.656483+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115359744 unmapped: 35487744 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:42.656828+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115359744 unmapped: 35487744 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:43.657075+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115359744 unmapped: 35487744 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:44.657311+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115367936 unmapped: 35479552 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:45.657517+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115367936 unmapped: 35479552 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:46.657786+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115367936 unmapped: 35479552 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:47.658040+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115367936 unmapped: 35479552 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:48.658253+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115367936 unmapped: 35479552 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:49.658438+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115367936 unmapped: 35479552 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:50.658665+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115367936 unmapped: 35479552 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:51.658878+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115367936 unmapped: 35479552 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:52.659109+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115367936 unmapped: 35479552 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:53.659391+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 35471360 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:54.659605+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 35471360 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:55.659840+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 35471360 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:56.660107+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 35471360 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:57.660465+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 35471360 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:58.660780+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 35471360 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:22:59.661048+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 35471360 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:00.661453+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 35471360 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:01.661718+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115376128 unmapped: 35471360 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:02.662040+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115384320 unmapped: 35463168 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:03.662324+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115384320 unmapped: 35463168 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:04.662567+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115384320 unmapped: 35463168 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:05.662844+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115384320 unmapped: 35463168 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:06.663236+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115384320 unmapped: 35463168 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:07.663490+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115384320 unmapped: 35463168 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:08.663760+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115384320 unmapped: 35463168 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:09.663990+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115384320 unmapped: 35463168 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:10.664243+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115392512 unmapped: 35454976 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:11.664454+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115392512 unmapped: 35454976 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:12.664665+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115392512 unmapped: 35454976 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:13.664885+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115392512 unmapped: 35454976 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:14.665104+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115392512 unmapped: 35454976 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:15.665329+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115392512 unmapped: 35454976 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:16.665504+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115392512 unmapped: 35454976 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:17.665655+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115392512 unmapped: 35454976 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:18.665819+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115392512 unmapped: 35454976 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:19.665973+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115392512 unmapped: 35454976 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:20.666224+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115392512 unmapped: 35454976 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:21.666369+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115392512 unmapped: 35454976 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:22.666541+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115400704 unmapped: 35446784 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:23.666696+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115400704 unmapped: 35446784 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:24.666827+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115400704 unmapped: 35446784 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:25.666956+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115400704 unmapped: 35446784 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:26.667254+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115400704 unmapped: 35446784 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:27.667517+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115400704 unmapped: 35446784 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:28.667754+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115400704 unmapped: 35446784 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:29.667923+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115400704 unmapped: 35446784 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:30.668179+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115400704 unmapped: 35446784 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:31.668342+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115400704 unmapped: 35446784 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:32.668522+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115400704 unmapped: 35446784 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:33.668775+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115400704 unmapped: 35446784 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:34.668939+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115408896 unmapped: 35438592 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:35.669123+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115408896 unmapped: 35438592 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:36.669343+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115408896 unmapped: 35438592 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:37.669479+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115408896 unmapped: 35438592 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:38.669675+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115408896 unmapped: 35438592 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:39.669801+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115408896 unmapped: 35438592 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:40.669968+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115408896 unmapped: 35438592 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:41.670129+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115408896 unmapped: 35438592 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:42.670373+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115417088 unmapped: 35430400 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:43.670510+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115417088 unmapped: 35430400 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:44.670687+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115417088 unmapped: 35430400 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:45.670855+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115417088 unmapped: 35430400 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:46.671029+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115417088 unmapped: 35430400 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:47.671200+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115417088 unmapped: 35430400 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:48.671331+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115417088 unmapped: 35430400 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:49.671477+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115417088 unmapped: 35430400 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:50.671656+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115417088 unmapped: 35430400 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:51.671813+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115417088 unmapped: 35430400 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:52.671950+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115425280 unmapped: 35422208 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:53.672133+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115425280 unmapped: 35422208 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:54.672344+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115425280 unmapped: 35422208 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:55.672510+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115425280 unmapped: 35422208 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:56.672626+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115425280 unmapped: 35422208 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:57.672857+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:58.673080+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115425280 unmapped: 35422208 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:23:59.673246+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115433472 unmapped: 35414016 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:00.673449+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115433472 unmapped: 35414016 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:01.673602+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115433472 unmapped: 35414016 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:02.673777+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115433472 unmapped: 35414016 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:03.673958+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115433472 unmapped: 35414016 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:04.674370+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115433472 unmapped: 35414016 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:05.674543+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115433472 unmapped: 35414016 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets getting new tickets!
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:06.675020+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _finish_auth 0
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:06.677373+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115433472 unmapped: 35414016 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:07.675244+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115433472 unmapped: 35414016 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:08.675438+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115433472 unmapped: 35414016 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:09.675663+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115433472 unmapped: 35414016 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:10.675875+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115433472 unmapped: 35414016 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:11.675998+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115433472 unmapped: 35414016 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:12.676195+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115441664 unmapped: 35405824 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:13.676367+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115441664 unmapped: 35405824 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:14.676654+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115441664 unmapped: 35405824 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:15.676774+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115441664 unmapped: 35405824 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:16.676929+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115441664 unmapped: 35405824 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:17.677192+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115441664 unmapped: 35405824 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:18.677381+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115441664 unmapped: 35405824 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:19.677513+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115441664 unmapped: 35405824 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:20.677738+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115449856 unmapped: 35397632 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:21.677907+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115449856 unmapped: 35397632 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:22.678105+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115449856 unmapped: 35397632 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:23.678223+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115449856 unmapped: 35397632 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:24.678371+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115449856 unmapped: 35397632 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:25.678538+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115449856 unmapped: 35397632 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:26.678725+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115449856 unmapped: 35397632 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:27.678870+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115449856 unmapped: 35397632 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:28.679044+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115449856 unmapped: 35397632 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:29.679259+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115449856 unmapped: 35397632 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:30.679542+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115449856 unmapped: 35397632 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:31.679699+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115449856 unmapped: 35397632 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:32.679871+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
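
This is the one line in the window where the OSD actually sends to a monitor: a message to mon.compute-0 over its msgr2 endpoint (3300 is the standard messenger-v2 monitor port). A sketch pulling the address apart, assuming the usual protocol:host:port/nonce layout of a Ceph entity address:

addr = "v2:192.168.122.100:3300/0"
proto, rest = addr.split(":", 1)      # "v2", "192.168.122.100:3300/0"
hostport, nonce = rest.rsplit("/", 1) # "192.168.122.100:3300", "0"
host, port = hostport.rsplit(":", 1)
print(proto, host, port, nonce)       # -> v2 192.168.122.100 3300 0
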
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115449856 unmapped: 35397632 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:33.680022+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115449856 unmapped: 35397632 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:34.680222+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115449856 unmapped: 35397632 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:35.680393+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115449856 unmapped: 35397632 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:36.680542+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115449856 unmapped: 35397632 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:37.680708+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115449856 unmapped: 35397632 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:38.680942+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115458048 unmapped: 35389440 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:39.681107+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115458048 unmapped: 35389440 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:40.681321+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115458048 unmapped: 35389440 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:41.681510+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115458048 unmapped: 35389440 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:42.681705+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115458048 unmapped: 35389440 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:43.681874+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115458048 unmapped: 35389440 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:44.682120+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115458048 unmapped: 35389440 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:45.682353+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115458048 unmapped: 35389440 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:46.682526+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115458048 unmapped: 35389440 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:47.682695+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115458048 unmapped: 35389440 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:48.682830+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115458048 unmapped: 35389440 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:49.683076+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115458048 unmapped: 35389440 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:50.686914+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115458048 unmapped: 35389440 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:51.687083+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115458048 unmapped: 35389440 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:52.687272+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115458048 unmapped: 35389440 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:53.687474+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115466240 unmapped: 35381248 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:54.687681+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115466240 unmapped: 35381248 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:55.687925+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115466240 unmapped: 35381248 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:56.688092+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115466240 unmapped: 35381248 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:57.688254+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115466240 unmapped: 35381248 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:58.688398+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115466240 unmapped: 35381248 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:24:59.688569+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115466240 unmapped: 35381248 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:00.688781+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115466240 unmapped: 35381248 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:01.688916+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115466240 unmapped: 35381248 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:02.689132+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115466240 unmapped: 35381248 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:03.689384+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115466240 unmapped: 35381248 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:04.689593+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115466240 unmapped: 35381248 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:05.689788+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115466240 unmapped: 35381248 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:06.689978+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115466240 unmapped: 35381248 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:07.690184+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115466240 unmapped: 35381248 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:08.690497+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115466240 unmapped: 35381248 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:09.690671+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115466240 unmapped: 35381248 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:10.690938+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115466240 unmapped: 35381248 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:11.691211+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115474432 unmapped: 35373056 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:12.691400+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115474432 unmapped: 35373056 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:13.691597+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115474432 unmapped: 35373056 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:14.691772+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115474432 unmapped: 35373056 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:15.691961+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115474432 unmapped: 35373056 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:16.692194+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115474432 unmapped: 35373056 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:17.692430+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115474432 unmapped: 35373056 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:18.692587+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115474432 unmapped: 35373056 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:19.692814+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115474432 unmapped: 35373056 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:20.693065+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115474432 unmapped: 35373056 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:21.693246+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115474432 unmapped: 35373056 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:22.693411+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115474432 unmapped: 35373056 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:23.693572+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115474432 unmapped: 35373056 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:24.693725+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115474432 unmapped: 35373056 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:25.693924+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115474432 unmapped: 35373056 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:26.694061+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115474432 unmapped: 35373056 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:27.694275+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115482624 unmapped: 35364864 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:28.694431+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115482624 unmapped: 35364864 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:29.694622+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115482624 unmapped: 35364864 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:30.694826+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115482624 unmapped: 35364864 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:31.694984+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115482624 unmapped: 35364864 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:32.695171+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115482624 unmapped: 35364864 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:33.695305+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115482624 unmapped: 35364864 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:34.695469+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115482624 unmapped: 35364864 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:35.695653+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115482624 unmapped: 35364864 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:36.695802+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115482624 unmapped: 35364864 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:37.696013+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115482624 unmapped: 35364864 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:38.696237+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115482624 unmapped: 35364864 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:39.696429+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115482624 unmapped: 35364864 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:40.696648+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115482624 unmapped: 35364864 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:41.696865+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115482624 unmapped: 35364864 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:42.697100+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115482624 unmapped: 35364864 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:43.697387+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115482624 unmapped: 35364864 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:44.697629+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115482624 unmapped: 35364864 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:45.697789+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115482624 unmapped: 35364864 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:46.697952+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115482624 unmapped: 35364864 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:47.698134+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115482624 unmapped: 35364864 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:48.698352+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115482624 unmapped: 35364864 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:49.698525+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115482624 unmapped: 35364864 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:50.698738+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115482624 unmapped: 35364864 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:51.698885+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115482624 unmapped: 35364864 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:52.705372+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115482624 unmapped: 35364864 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:53.705520+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115482624 unmapped: 35364864 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:54.705706+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115482624 unmapped: 35364864 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:55.705896+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115482624 unmapped: 35364864 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:56.706056+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115482624 unmapped: 35364864 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:57.706226+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115490816 unmapped: 35356672 heap: 150847488 old mem: 2845415833 new mem: 2845415833
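
Across this flush window the mapped figure in the tune_memory lines climbs in fixed steps (115449856, 115458048, ..., 115490816) while unmapped shrinks by the same amount, so the heap total never moves from 150847488: each retune reclaims 8 KiB, i.e. two 4 KiB pages, of previously unmapped space rather than growing the heap. A quick check over the six values that appear above:

mapped   = [115449856, 115458048, 115466240, 115474432, 115482624, 115490816]
unmapped = [35397632, 35389440, 35381248, 35373056, 35364864, 35356672]

print({b - a for a, b in zip(mapped, mapped[1:])})      # -> {8192}
print({a - b for a, b in zip(unmapped, unmapped[1:])})  # -> {8192}
print({m + u for m, u in zip(mapped, unmapped)})        # -> {150847488}
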
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:58.706411+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115490816 unmapped: 35356672 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:25:59.706616+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115490816 unmapped: 35356672 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:00.706908+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115490816 unmapped: 35356672 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:01.707238+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115490816 unmapped: 35356672 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:02.707462+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115490816 unmapped: 35356672 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:03.707752+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115490816 unmapped: 35356672 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:04.707966+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115490816 unmapped: 35356672 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:05.708208+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115490816 unmapped: 35356672 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:06.708373+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115490816 unmapped: 35356672 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:07.708521+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115490816 unmapped: 35356672 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:08.708650+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115490816 unmapped: 35356672 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:09.708826+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115490816 unmapped: 35356672 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:10.709030+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115490816 unmapped: 35356672 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:11.709179+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115490816 unmapped: 35356672 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:12.709328+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115490816 unmapped: 35356672 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:13.709485+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115490816 unmapped: 35356672 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:14.709677+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115490816 unmapped: 35356672 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:15.709869+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115490816 unmapped: 35356672 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:16.710022+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115490816 unmapped: 35356672 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:17.710263+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115490816 unmapped: 35356672 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:18.710475+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115490816 unmapped: 35356672 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:19.710656+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115490816 unmapped: 35356672 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:20.710838+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115490816 unmapped: 35356672 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:21.711042+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115490816 unmapped: 35356672 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb25e58c00 session 0x55fb26c18d20
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb25fd8000
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:22.711255+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb235d5c00 session 0x55fb26f51e00
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb25e58c00
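
[note] The ms_handle_reset entries mark messenger connections being torn down and their OSD sessions dropped; each is followed here by a fresh authentication challenge as the peer reconnects. A few of these during mgr/mon churn are routine, but a tight loop on one connection would point at auth or network flapping. A small counting sketch — the pointer values only identify objects within this one process lifetime:

    import re
    import sys
    from collections import Counter

    RESET_RE = re.compile(
        r"ms_handle_reset con (0x[0-9a-f]+) session (0x[0-9a-f]+)"
    )

    resets = Counter()
    for line in sys.stdin:
        m = RESET_RE.search(line)
        if m:
            resets[m.group(1)] += 1

    for con, n in resets.most_common(5):
        print(f"{con}: {n} resets")  # here each con resets once: benign churn
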
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115490816 unmapped: 35356672 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:23.711475+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115490816 unmapped: 35356672 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb2417cc00 session 0x55fb26f51680
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb24154c00
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:24.711676+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115490816 unmapped: 35356672 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:25.711880+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115490816 unmapped: 35356672 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:26.712116+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 ms_handle_reset con 0x55fb261b7800 session 0x55fb26c19c20
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: handle_auth_request added challenge on 0x55fb24155c00
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115490816 unmapped: 35356672 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:27.712357+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115499008 unmapped: 35348480 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:28.712504+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115499008 unmapped: 35348480 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:29.712705+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115499008 unmapped: 35348480 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:30.712927+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115499008 unmapped: 35348480 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:31.713087+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115499008 unmapped: 35348480 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:32.713322+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115499008 unmapped: 35348480 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:33.713490+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115499008 unmapped: 35348480 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:34.713652+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115499008 unmapped: 35348480 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:35.713798+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115499008 unmapped: 35348480 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:36.714064+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115499008 unmapped: 35348480 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:37.714236+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115499008 unmapped: 35348480 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:38.714391+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115499008 unmapped: 35348480 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:39.714523+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115499008 unmapped: 35348480 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:40.714820+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115499008 unmapped: 35348480 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:41.714969+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115499008 unmapped: 35348480 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:42.715126+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115499008 unmapped: 35348480 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:43.715287+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115499008 unmapped: 35348480 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:44.715456+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115499008 unmapped: 35348480 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:45.715617+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115499008 unmapped: 35348480 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:46.715817+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115499008 unmapped: 35348480 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:47.715970+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115499008 unmapped: 35348480 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:48.716105+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115499008 unmapped: 35348480 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:49.716290+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115499008 unmapped: 35348480 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:50.716511+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115499008 unmapped: 35348480 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:51.716670+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115499008 unmapped: 35348480 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:52.716856+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115499008 unmapped: 35348480 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:53.717050+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115499008 unmapped: 35348480 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:54.717247+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115499008 unmapped: 35348480 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:55.717388+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115499008 unmapped: 35348480 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:56.717517+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115499008 unmapped: 35348480 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:57.717665+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: do_command 'config diff' '{prefix=config diff}'
Dec 06 10:27:31 compute-1 ceph-osd[77465]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Dec 06 10:27:31 compute-1 ceph-osd[77465]: do_command 'config show' '{prefix=config show}'
Dec 06 10:27:31 compute-1 ceph-osd[77465]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115859456 unmapped: 34988032 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: do_command 'counter dump' '{prefix=counter dump}'
Dec 06 10:27:31 compute-1 ceph-osd[77465]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:58.717799+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: do_command 'counter schema' '{prefix=counter schema}'
Dec 06 10:27:31 compute-1 ceph-osd[77465]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
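
[note] The do_command entries are the OSD answering requests on its local admin socket: 'config diff/show', 'counter dump/schema' and, just below, 'log dump'. The same endpoints can be queried by hand; the sketch below shells out to the stock ceph CLI rather than speaking the socket protocol directly, and assumes the CLI on this host can reach osd.0's admin socket (e.g. inside a cephadm shell):

    import json
    import subprocess

    # Illustrative only: 'ceph daemon <name> <cmd>' talks to the local
    # admin socket, the same path the do_command lines above record.
    def admin_socket(daemon, *cmd):
        out = subprocess.run(
            ["ceph", "daemon", daemon, *cmd],
            check=True, capture_output=True, text=True,
        ).stdout
        return json.loads(out)

    cfg = admin_socket("osd.0", "config", "show")  # same 'config show' as above
    print(cfg.get("osd_memory_target"))            # expect "4294967296" here
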
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Dec 06 10:27:31 compute-1 ceph-osd[77465]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Dec 06 10:27:31 compute-1 ceph-osd[77465]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192748 data_alloc: 218103808 data_used: 4796416
Dec 06 10:27:31 compute-1 ceph-osd[77465]: osd.0 154 heartbeat osd_stat(store_statfs(0x4f9faf000/0x0/0x4ffc00000, data 0x11f064b/0x12ad000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115482624 unmapped: 35364864 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:26:59.717927+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: prioritycache tune_memory target: 4294967296 mapped: 115851264 unmapped: 34996224 heap: 150847488 old mem: 2845415833 new mem: 2845415833
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: tick
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_tickets
Dec 06 10:27:31 compute-1 ceph-osd[77465]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2025-12-06T10:27:00.718095+0000)
Dec 06 10:27:31 compute-1 ceph-osd[77465]: do_command 'log dump' '{prefix=log dump}'
Dec 06 10:27:31 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Dec 06 10:27:31 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1485346110' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Dec 06 10:27:31 compute-1 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 06 10:27:31 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0)
Dec 06 10:27:31 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/795191216' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Dec 06 10:27:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:27:31 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:27:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:27:31 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:27:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:27:31 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:27:32 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:27:32 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
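
[note] The ganesha.nfsd messages show the NFS server (the nfs-cephfs unit above) entering its 90-second grace window: new state-creating operations are refused while previously connected clients may reclaim locks, and with a client count of 0 it immediately probes whether grace can be lifted. A trivial helper for the earliest lift time, assuming the logged start and duration:

    from datetime import datetime, timedelta

    def grace_end(start, duration_s=90):
        # During [start, start + duration) only reclaim ops are honoured.
        return start + timedelta(seconds=duration_s)

    start = datetime(2025, 12, 6, 10, 27, 31)  # from the EVENT line above
    print(grace_end(start))  # 2025-12-06 10:29:01
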
Dec 06 10:27:32 compute-1 ceph-mon[79770]: from='client.28342 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:27:32 compute-1 ceph-mon[79770]: from='client.18924 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:27:32 compute-1 ceph-mon[79770]: from='client.26987 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:27:32 compute-1 ceph-mon[79770]: from='client.28369 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:27:32 compute-1 ceph-mon[79770]: from='client.18942 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:27:32 compute-1 ceph-mon[79770]: pgmap v1395: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
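
[note] The periodic pgmap lines are the one-line cluster summary: all 337 placement groups active+clean, 41 MiB of logical data on 273 MiB of raw usage across 60 GiB of OSD capacity. A parsing sketch, with the format inferred from the lines themselves:

    import re

    PGMAP_RE = re.compile(
        r"pgmap v(\d+): (\d+) pgs: (.+?); (.+?) data, (.+?) used, "
        r"(.+?) / (.+?) avail"
    )

    line = ("pgmap v1395: 337 pgs: 337 active+clean; 41 MiB data, "
            "273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s")
    v, pgs, states, data, used, avail, total = PGMAP_RE.search(line).groups()
    print(v, pgs, states.split(", "), data, used, f"{avail}/{total}")
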
Dec 06 10:27:32 compute-1 ceph-mon[79770]: from='client.27002 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:27:32 compute-1 ceph-mon[79770]: from='client.18963 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:27:32 compute-1 ceph-mon[79770]: from='client.28390 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:27:32 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/4096137834' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Dec 06 10:27:32 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/1485346110' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Dec 06 10:27:32 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/435042096' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Dec 06 10:27:32 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/544280558' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Dec 06 10:27:32 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/795191216' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Dec 06 10:27:32 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/3054406261' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Dec 06 10:27:32 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0)
Dec 06 10:27:32 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3485379357' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Dec 06 10:27:32 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon stat"} v 0)
Dec 06 10:27:32 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/568451492' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
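
[note] mon.compute-1 is a peon here, so commands that land on it show up twice: once as handle_command and once on the audit channel before dispatch. The burst of mgr/orch/balancer queries arriving from all three 192.168.122.x hosts looks like an automated collector polling the same read-only set rather than an operator. A sketch that tallies which command prefixes each entity is issuing; the greedy regex relies on ': dispatch' terminating the line:

    import json
    import re
    import sys
    from collections import Counter

    AUDIT_RE = re.compile(r"entity='([^']+)' cmd=(\[.*\]): dispatch")

    mix = Counter()
    for line in sys.stdin:
        m = AUDIT_RE.search(line)
        if not m:
            continue
        # cmd is a JSON array of command dicts, as seen in the lines above.
        for cmd in json.loads(m.group(2)):
            mix[(m.group(1), cmd.get("prefix"))] += 1

    for (entity, prefix), n in mix.most_common():
        print(f"{entity:>14} {prefix:<24} {n}")
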
Dec 06 10:27:32 compute-1 crontab[251477]: (root) LIST (root)
Dec 06 10:27:32 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:27:32 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:27:32 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:27:32.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:27:32 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:27:32 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:27:32 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:27:32.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
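
[note] radosgw's beast frontend writes one access line per request; the anonymous 'HEAD / HTTP/1.0' probes arriving every two seconds from 192.168.122.100/.102 have the shape of load-balancer health checks rather than S3 traffic. A latency/status extraction sketch, with the field layout inferred from the lines above:

    import re

    BEAST_RE = re.compile(
        r'beast: \S+: (\S+) \S+ (\S+) \[([^\]]+)\] "([^"]+)" (\d+) (\d+)'
        r'.*latency=([0-9.]+)s'
    )

    line = ('beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous '
            '[06/Dec/2025:10:27:32.802 +0000] "HEAD / HTTP/1.0" 200 0 '
            '- - - latency=0.001000025s')
    ip, user, ts, req, status, size, latency = BEAST_RE.search(line).groups()
    print(ip, req, status, f"{float(latency) * 1000:.3f} ms")  # ~1.000 ms
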
Dec 06 10:27:33 compute-1 ceph-mon[79770]: from='client.27017 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:27:33 compute-1 ceph-mon[79770]: from='client.18978 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:27:33 compute-1 ceph-mon[79770]: from='client.28402 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:27:33 compute-1 ceph-mon[79770]: from='client.27038 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:27:33 compute-1 ceph-mon[79770]: from='client.18999 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:27:33 compute-1 ceph-mon[79770]: from='client.28411 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:27:33 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/2496513314' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Dec 06 10:27:33 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/3485379357' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Dec 06 10:27:33 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/390884467' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Dec 06 10:27:33 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/568451492' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Dec 06 10:27:33 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/1616872659' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Dec 06 10:27:33 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "node ls"} v 0)
Dec 06 10:27:33 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1892291010' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Dec 06 10:27:34 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
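
[note] The monitor runs its own cache autotuner as well: _set_new_cache_sizes re-splits roughly 1 GB between incremental osdmaps, full osdmaps and the rocksdb block cache. The arithmetic below just sanity-checks the logged split; the ratios come from this line, not from Ceph defaults:

    cache_size = 1_020_054_731
    parts = {"inc": 343_932_928, "full": 348_127_232, "kv": 318_767_104}
    for name, size in parts.items():
        print(f"{name}: {size / cache_size:.1%}")
    # inc: 33.7%, full: 34.1%, kv: 31.3% -- ~99% of the budget accounted for
    print(f"total: {sum(parts.values()) / cache_size:.1%}")
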
Dec 06 10:27:34 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush class ls"} v 0)
Dec 06 10:27:34 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2167965242' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Dec 06 10:27:34 compute-1 ceph-mon[79770]: from='client.27056 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:27:34 compute-1 ceph-mon[79770]: from='client.19020 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:27:34 compute-1 ceph-mon[79770]: from='client.19026 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:27:34 compute-1 ceph-mon[79770]: from='client.27077 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:27:34 compute-1 ceph-mon[79770]: pgmap v1396: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:27:34 compute-1 ceph-mon[79770]: from='client.19047 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:27:34 compute-1 ceph-mon[79770]: from='client.28444 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:27:34 compute-1 ceph-mon[79770]: from='client.27083 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:27:34 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/1154383948' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Dec 06 10:27:34 compute-1 ceph-mon[79770]: from='client.19068 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:27:34 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/3251340584' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Dec 06 10:27:34 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/1045814600' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Dec 06 10:27:34 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/1892291010' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Dec 06 10:27:34 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/3432369359' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Dec 06 10:27:34 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/2986543317' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Dec 06 10:27:34 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/2167965242' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Dec 06 10:27:34 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/772093391' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Dec 06 10:27:34 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} v 0)
Dec 06 10:27:34 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/167369165' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Dec 06 10:27:34 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush dump"} v 0)
Dec 06 10:27:34 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1798232326' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Dec 06 10:27:34 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:27:34 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:27:34 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:27:34.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:27:34 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:27:34 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:27:34 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:27:34.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:27:34 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush rule ls"} v 0)
Dec 06 10:27:34 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2723227824' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Dec 06 10:27:34 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr dump", "format": "json-pretty"} v 0)
Dec 06 10:27:34 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3073635263' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Dec 06 10:27:35 compute-1 ceph-mon[79770]: from='client.28468 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:27:35 compute-1 ceph-mon[79770]: from='client.27101 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:27:35 compute-1 ceph-mon[79770]: from='client.19080 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:27:35 compute-1 ceph-mon[79770]: from='client.28492 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:27:35 compute-1 ceph-mon[79770]: from='client.19095 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:27:35 compute-1 ceph-mon[79770]: from='client.27119 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:27:35 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/2183060927' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Dec 06 10:27:35 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/3294404057' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Dec 06 10:27:35 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/3984842228' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Dec 06 10:27:35 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/167369165' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Dec 06 10:27:35 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/1798232326' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Dec 06 10:27:35 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/4024216677' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Dec 06 10:27:35 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/3501769992' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Dec 06 10:27:35 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/2266947748' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Dec 06 10:27:35 compute-1 ceph-mon[79770]: pgmap v1397: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:27:35 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/2723227824' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Dec 06 10:27:35 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/2863449111' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Dec 06 10:27:35 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/3073635263' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Dec 06 10:27:35 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/2632608971' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Dec 06 10:27:35 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/120051130' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Dec 06 10:27:35 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/2031727652' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Dec 06 10:27:35 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush show-tunables"} v 0)
Dec 06 10:27:35 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1172937656' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Dec 06 10:27:35 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr metadata", "format": "json-pretty"} v 0)
Dec 06 10:27:35 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2216123752' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Dec 06 10:27:35 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush tree", "show_shadow": true} v 0)
Dec 06 10:27:35 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3541376245' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Dec 06 10:27:35 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr module ls", "format": "json-pretty"} v 0)
Dec 06 10:27:35 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1407133659' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Dec 06 10:27:36 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd erasure-code-profile ls"} v 0)
Dec 06 10:27:36 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3862076468' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Dec 06 10:27:36 compute-1 systemd[1]: Starting Hostname Service...
Dec 06 10:27:36 compute-1 systemd[1]: Started Hostname Service.
Dec 06 10:27:36 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr services", "format": "json-pretty"} v 0)
Dec 06 10:27:36 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4109644866' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Dec 06 10:27:36 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/1172937656' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Dec 06 10:27:36 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/3465320604' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Dec 06 10:27:36 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/3652493996' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Dec 06 10:27:36 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/2216123752' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Dec 06 10:27:36 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/832528068' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Dec 06 10:27:36 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/283759111' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Dec 06 10:27:36 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/3541376245' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Dec 06 10:27:36 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/4281400625' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Dec 06 10:27:36 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/2213913475' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Dec 06 10:27:36 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/1407133659' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Dec 06 10:27:36 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/2487044674' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Dec 06 10:27:36 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/3162727959' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Dec 06 10:27:36 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/3862076468' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Dec 06 10:27:36 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/3200042151' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Dec 06 10:27:36 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/3329454876' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Dec 06 10:27:36 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd metadata"} v 0)
Dec 06 10:27:36 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1301683355' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Dec 06 10:27:36 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:27:36 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:27:36 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:27:36.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:27:36 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr stat", "format": "json-pretty"} v 0)
Dec 06 10:27:36 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/475292124' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Dec 06 10:27:36 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:27:36 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:27:36 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:27:36.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:27:36 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd utilization"} v 0)
Dec 06 10:27:36 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1883974811' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Dec 06 10:27:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:27:36 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:27:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:27:37 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:27:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:27:37 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:27:37 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:27:37 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:27:37 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr versions", "format": "json-pretty"} v 0)
Dec 06 10:27:37 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/923492754' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Dec 06 10:27:37 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/4109644866' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Dec 06 10:27:37 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/2938496044' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Dec 06 10:27:37 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/4236343578' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Dec 06 10:27:37 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/1301683355' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Dec 06 10:27:37 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/1352424821' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Dec 06 10:27:37 compute-1 ceph-mon[79770]: from='client.28621 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:27:37 compute-1 ceph-mon[79770]: from='client.19245 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:27:37 compute-1 ceph-mon[79770]: pgmap v1398: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:27:37 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/475292124' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Dec 06 10:27:37 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/1883974811' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Dec 06 10:27:37 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/1546404607' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Dec 06 10:27:37 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/2637026852' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Dec 06 10:27:37 compute-1 ceph-mon[79770]: from='client.28639 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:27:37 compute-1 ceph-mon[79770]: from='client.19260 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:27:37 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/923492754' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Dec 06 10:27:38 compute-1 sudo[252299]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Dec 06 10:27:38 compute-1 sudo[252299]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Dec 06 10:27:38 compute-1 sudo[252299]: pam_unix(sudo:session): session closed for user root
Dec 06 10:27:38 compute-1 ceph-mon[79770]: from='client.28648 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:27:38 compute-1 ceph-mon[79770]: from='client.27251 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:27:38 compute-1 ceph-mon[79770]: from='client.19266 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:27:38 compute-1 ceph-mon[79770]: from='client.27257 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:27:38 compute-1 ceph-mon[79770]: from='client.28666 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:27:38 compute-1 ceph-mon[79770]: from='client.27269 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:27:38 compute-1 ceph-mon[79770]: from='client.19287 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:27:38 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/862610040' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Dec 06 10:27:38 compute-1 ceph-mon[79770]: from='client.27281 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:27:38 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/1722450257' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Dec 06 10:27:38 compute-1 ceph-mon[79770]: from='client.28684 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:27:38 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "quorum_status"} v 0)
Dec 06 10:27:38 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/897535951' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Dec 06 10:27:38 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:27:38 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:27:38 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:27:38.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:27:38 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:27:38 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:27:38 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:27:38.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:27:39 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:27:39 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "versions"} v 0)
Dec 06 10:27:39 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3518358618' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Dec 06 10:27:39 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd tree", "format": "json-pretty"} v 0)
Dec 06 10:27:39 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2997084197' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Dec 06 10:27:39 compute-1 ceph-mon[79770]: from='client.19305 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:27:39 compute-1 ceph-mon[79770]: from='client.27296 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:27:39 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/4053778090' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Dec 06 10:27:39 compute-1 ceph-mon[79770]: from='client.28705 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:27:39 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/2754842451' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Dec 06 10:27:39 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/897535951' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Dec 06 10:27:39 compute-1 ceph-mon[79770]: pgmap v1399: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Dec 06 10:27:39 compute-1 ceph-mon[79770]: from='client.19323 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:27:39 compute-1 ceph-mon[79770]: from='client.27311 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:27:39 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/705246239' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Dec 06 10:27:39 compute-1 ceph-mon[79770]: from='client.28723 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:27:39 compute-1 ceph-mon[79770]: from='mgr.14652 192.168.122.100:0/2181988963' entity='mgr.compute-0.qhdjwa' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Dec 06 10:27:39 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/3336786673' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Dec 06 10:27:39 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/3518358618' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Dec 06 10:27:39 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/2997084197' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Dec 06 10:27:39 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail", "format": "json-pretty"} v 0)
Dec 06 10:27:39 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/458820190' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Dec 06 10:27:40 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd tree", "format": "json-pretty"} v 0)
Dec 06 10:27:40 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1206616175' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Dec 06 10:27:40 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 06 10:27:40 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 06 10:27:40 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Dec 06 10:27:40 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Dec 06 10:27:40 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 06 10:27:40 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 06 10:27:40 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Dec 06 10:27:40 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Dec 06 10:27:40 compute-1 ceph-mon[79770]: from='client.19338 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:27:40 compute-1 ceph-mon[79770]: from='client.19341 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:27:40 compute-1 ceph-mon[79770]: from='client.28741 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:27:40 compute-1 ceph-mon[79770]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 06 10:27:40 compute-1 ceph-mon[79770]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 06 10:27:40 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/2979283974' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Dec 06 10:27:40 compute-1 ceph-mon[79770]: from='client.19365 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:27:40 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/458820190' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Dec 06 10:27:40 compute-1 ceph-mon[79770]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Dec 06 10:27:40 compute-1 ceph-mon[79770]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Dec 06 10:27:40 compute-1 ceph-mon[79770]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 06 10:27:40 compute-1 ceph-mon[79770]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 06 10:27:40 compute-1 ceph-mon[79770]: from='client.27341 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:27:40 compute-1 ceph-mon[79770]: from='client.28771 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:27:40 compute-1 ceph-mon[79770]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 06 10:27:40 compute-1 ceph-mon[79770]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 06 10:27:40 compute-1 ceph-mon[79770]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Dec 06 10:27:40 compute-1 ceph-mon[79770]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Dec 06 10:27:40 compute-1 ceph-mon[79770]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Dec 06 10:27:40 compute-1 ceph-mon[79770]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Dec 06 10:27:40 compute-1 ceph-mon[79770]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 06 10:27:40 compute-1 ceph-mon[79770]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 06 10:27:40 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/1206616175' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Dec 06 10:27:40 compute-1 ceph-mon[79770]: from='client.19407 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:27:40 compute-1 ceph-mon[79770]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 06 10:27:40 compute-1 ceph-mon[79770]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 06 10:27:40 compute-1 ceph-mon[79770]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Dec 06 10:27:40 compute-1 ceph-mon[79770]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Dec 06 10:27:40 compute-1 ceph-mon[79770]: from='client.27356 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Dec 06 10:27:40 compute-1 ceph-mon[79770]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 06 10:27:40 compute-1 ceph-mon[79770]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 06 10:27:40 compute-1 ceph-mon[79770]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Dec 06 10:27:40 compute-1 ceph-mon[79770]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Dec 06 10:27:40 compute-1 ceph-mon[79770]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 06 10:27:40 compute-1 ceph-mon[79770]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 06 10:27:40 compute-1 ceph-mon[79770]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Dec 06 10:27:40 compute-1 ceph-mon[79770]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Dec 06 10:27:40 compute-1 ceph-mon[79770]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Dec 06 10:27:40 compute-1 ceph-mon[79770]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Dec 06 10:27:40 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:27:40 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:27:40 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:27:40.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:27:40 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:27:40 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:27:40 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:27:40.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:27:40 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 06 10:27:40 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 06 10:27:41 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Dec 06 10:27:41 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Dec 06 10:27:41 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config dump"} v 0)
Dec 06 10:27:41 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2114392114' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Dec 06 10:27:41 compute-1 ceph-mon[79770]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 06 10:27:41 compute-1 ceph-mon[79770]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 06 10:27:41 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/1960310445' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Dec 06 10:27:41 compute-1 ceph-mon[79770]: pgmap v1400: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:27:41 compute-1 ceph-mon[79770]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Dec 06 10:27:41 compute-1 ceph-mon[79770]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Dec 06 10:27:41 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/2258838112' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Dec 06 10:27:41 compute-1 ceph-mon[79770]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Dec 06 10:27:41 compute-1 ceph-mon[79770]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Dec 06 10:27:41 compute-1 ceph-mon[79770]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Dec 06 10:27:41 compute-1 ceph-mon[79770]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Dec 06 10:27:41 compute-1 ceph-mon[79770]: from='client.28891 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:27:41 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/2114392114' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Dec 06 10:27:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:27:41 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Dec 06 10:27:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:27:41 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Dec 06 10:27:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:27:41 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Dec 06 10:27:42 compute-1 ceph-5ecd3f74-dade-5fc4-92ce-8950ae424258-nfs-cephfs-0-0-compute-1-djsnbu[238108]: 06/12/2025 10:27:42 : epoch 693402a6 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Dec 06 10:27:42 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "detail": "detail"} v 0)
Dec 06 10:27:42 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/219266435' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Dec 06 10:27:42 compute-1 ceph-mon[79770]: from='client.19473 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:27:42 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/1936562756' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Dec 06 10:27:42 compute-1 ceph-mon[79770]: from='client.27455 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:27:42 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/2577467961' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Dec 06 10:27:42 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/2590202385' entity='client.admin' cmd=[{"prefix": "df"}]: dispatch
Dec 06 10:27:42 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/219266435' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Dec 06 10:27:42 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/3419239298' entity='client.admin' cmd=[{"prefix": "df"}]: dispatch
Dec 06 10:27:42 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/617292848' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
Dec 06 10:27:42 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df"} v 0)
Dec 06 10:27:42 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/492615239' entity='client.admin' cmd=[{"prefix": "df"}]: dispatch
Dec 06 10:27:42 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:27:42 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:27:42 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:27:42.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:27:42 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:27:42 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.000000000s ======
Dec 06 10:27:42 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:27:42.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Dec 06 10:27:43 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "fs dump"} v 0)
Dec 06 10:27:43 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2841293906' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
Dec 06 10:27:43 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "fs ls"} v 0)
Dec 06 10:27:43 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4035985611' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch
Dec 06 10:27:43 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/492615239' entity='client.admin' cmd=[{"prefix": "df"}]: dispatch
Dec 06 10:27:43 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/755680862' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
Dec 06 10:27:43 compute-1 ceph-mon[79770]: pgmap v1401: 337 pgs: 337 active+clean; 41 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Dec 06 10:27:43 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/1338248061' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch
Dec 06 10:27:43 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/2841293906' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
Dec 06 10:27:43 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/305757430' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch
Dec 06 10:27:43 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/4035985611' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch
Dec 06 10:27:44 compute-1 ceph-mon[79770]: mon.compute-1@2(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 06 10:27:44 compute-1 ceph-mon[79770]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mds stat"} v 0)
Dec 06 10:27:44 compute-1 ceph-mon[79770]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2324197860' entity='client.admin' cmd=[{"prefix": "mds stat"}]: dispatch
Dec 06 10:27:44 compute-1 radosgw[83354]: ====== starting new request req=0x7f383cf1f5d0 =====
Dec 06 10:27:44 compute-1 radosgw[83354]: ====== starting new request req=0x7f383ce9e5d0 =====
Dec 06 10:27:44 compute-1 radosgw[83354]: ====== req done req=0x7f383ce9e5d0 op status=0 http_status=200 latency=0.001000025s ======
Dec 06 10:27:44 compute-1 radosgw[83354]: beast: 0x7f383ce9e5d0: 192.168.122.100 - anonymous [06/Dec/2025:10:27:44.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Dec 06 10:27:44 compute-1 radosgw[83354]: ====== req done req=0x7f383cf1f5d0 op status=0 http_status=200 latency=0.002000050s ======
Dec 06 10:27:44 compute-1 radosgw[83354]: beast: 0x7f383cf1f5d0: 192.168.122.102 - anonymous [06/Dec/2025:10:27:44.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000050s
Dec 06 10:27:44 compute-1 ceph-mon[79770]: from='client.28948 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:27:44 compute-1 ceph-mon[79770]: from='client.19518 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:27:44 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/3124525284' entity='client.admin' cmd=[{"prefix": "mds stat"}]: dispatch
Dec 06 10:27:44 compute-1 ceph-mon[79770]: from='client.27485 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""]}]: dispatch
Dec 06 10:27:44 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/2890381324' entity='client.admin' cmd=[{"prefix": "mds stat"}]: dispatch
Dec 06 10:27:44 compute-1 ceph-mon[79770]: from='client.? 192.168.122.102:0/3971778577' entity='client.admin' cmd=[{"prefix": "mon dump"}]: dispatch
Dec 06 10:27:44 compute-1 ceph-mon[79770]: from='client.? 192.168.122.101:0/2324197860' entity='client.admin' cmd=[{"prefix": "mds stat"}]: dispatch
Dec 06 10:27:44 compute-1 ceph-mon[79770]: from='client.? 192.168.122.100:0/2166653876' entity='client.admin' cmd=[{"prefix": "mon dump"}]: dispatch
Dec 06 10:27:44 compute-1 podman[253106]: 2025-12-06 10:27:44.884727942 +0000 UTC m=+0.177520703 container health_status 00ce4a06fb8121e4e606908b6e32f0235411ef43da2f30b2d061840227db729c (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)